>>/2055/
>> So, Father, where do we download the full ChatGPT sources and the original datasets? Or did you shit your pants again?
> The technique, copy-paste, there are several models, *dirty-pants dance intensifies*
Well, Father, it's time for you to change your pants.
>> Machine learning is more reproducible and open than academic science.
Well, how can anyone not believe a schizophrenic with a Jewish-conspiracy mania and a neural-network obsession who soils his own pants?
>> And the article came out.
And how does it feel, sick man, to have proudly bet your pants, then dumped a pile into them and tried to slip away under false pretexts?
>> That is, it may, with some probability, give the correct answer, but it does not perform this detective work.
Just like a broken clock, which with some probability shows the right time. Do you have any numerical estimate of that probability at all? Or will it all end again with the twerking of a dirty rear to fervent cries about the coming neuro-goddess?
>> Everyone with even a slight interest in science and mathematics has heard of Tao. He is not just a funny Asian with a high IQ, and not even just a Fields Medal winner, but a person other people at Fields Medal level turn to for advice.
Have you ever read Perelman's correspondence with Tao? Isn’t it time to just scale everything up?
>> Speaking of scaling. It has been shown that neural networks can be compressed down to 4 bits per parameter with only a small loss of accuracy. That means the same ChatGPT would fit into 87 gigabytes – it could be run on four 3090s.
On a system costing 600k rubles.
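The arithmetic behind that claim can be checked on a napkin, assuming a 175B-parameter GPT-3-class model (an assumption: OpenAI has not published ChatGPT's actual size) and 24 GB of VRAM per 3090:

```python
# Back-of-the-envelope VRAM estimate for 4-bit quantization.
# PARAMS is an assumed GPT-3-class size; the real figure is unpublished.
PARAMS = 175e9          # parameters (assumed)
BITS_PER_PARAM = 4      # 4-bit quantization, as claimed above
GPU_VRAM_GB = 24        # one RTX 3090

weights_gb = PARAMS * BITS_PER_PARAM / 8 / 1e9   # bits -> bytes -> GB
gpus_needed = int(-(-weights_gb // GPU_VRAM_GB)) # ceiling division

print(f"{weights_gb:.1f} GB of weights across ~{gpus_needed}x 3090")
# -> 87.5 GB of weights across ~4x 3090
```

Note this counts weights only; activations and attention KV-cache would eat into the remaining headroom of the 96 GB total.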
Time. Time! Soon, dear, you will be the head of the scaling division!