>>/dobrochan/2646@603
Thanks, but I'm already familiar with ML at a basic level; now I'm looking into NLP in more detail.
Also, a request: what pre-trained generative models for Russian are there that could be fine-tuned on specific texts? GPT-2 has 1.5 billion parameters, which is a bit too heavy to fine-tune in Colab or on budget hardware; I'd like something lighter. And BERT doesn't generate text, it guesses masked tokens, which is a different thing.
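For reference, here is a minimal sketch of loading a compact Russian causal LM via the Hugging Face transformers library. The model id is an assumption (the ruGPT-3 small checkpoint, ~125M parameters, published by Sber; check the hub for the current name), not something named in the post:

```python
# A minimal sketch, assuming the Hugging Face "transformers" library is installed
# and assuming the ruGPT-3 small checkpoint id below (verify on the model hub).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ai-forever/rugpt3small_based_on_gpt2"  # assumed id; ~125M params, fits in Colab
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Однажды утром"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,    # sample instead of greedy decoding for more varied text
    top_p=0.95,
    temperature=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A model of this size can also be fine-tuned on a single free Colab GPU, unlike the 1.5B GPT-2 variant mentioned above.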