>>/8067/
By the way, an employee at Stability AI has confirmed they fucked with the weights of the model after training was done in order to censor it, which is why women become blobs. Exactly as I had thought. I knew it wasn't the training data, since the model seemed to go from extremely well functioning to incoherent as soon as it had to depict a girl in almost any pose. They first train a model and then prompt it to make naked girls or other things they don't want. Every time it does so, the model is "punished", as in given the opposite of a reward value, until its ability to depict those things (and anything nearby) gets scrambled. I think this is called DPO safety finetuning, but other similar techniques are in use.
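For anyone curious what that "punishment" actually looks like, here's a minimal sketch of a DPO-style preference loss in Python/PyTorch. This is my own toy illustration, not Stability's actual training code: the names, the beta value, and the idea that "good" means the sanitized output and "bad" means the suppressed one are all assumptions on my part, and real image-model variants (e.g. Diffusion-DPO) work on noise predictions rather than plain sequence log-probs.

```python
import torch
import torch.nn.functional as F

def dpo_safety_loss(policy_logp_good, policy_logp_bad,
                    ref_logp_good, ref_logp_bad, beta=0.1):
    """Toy DPO-style preference loss (my sketch, not any lab's real code).

    'good'  = the output the trainer wants (e.g. a clothed/sanitized image),
    'bad'   = the output being suppressed (the thing they don't want).
    Inputs are the summed log-probabilities of each full output under the
    trainable policy model and under a frozen reference copy of it.
    """
    # How far the policy has drifted from the reference on each side.
    good_margin = policy_logp_good - ref_logp_good
    bad_margin = policy_logp_bad - ref_logp_bad
    # Loss goes down as the policy prefers 'good' over 'bad' more strongly
    # than the reference did; crank this hard enough on "girl in a pose"
    # prompts and collateral damage to nearby concepts is what you'd expect.
    return -F.logsigmoid(beta * (good_margin - bad_margin)).mean()

# made-up log-prob numbers just to show the call
pol_good = torch.tensor([-12.0]); pol_bad = torch.tensor([-11.0])
ref_good = torch.tensor([-12.5]); ref_bad = torch.tensor([-10.5])
print(dpo_safety_loss(pol_good, pol_bad, ref_good, ref_bad))
```

Point being: there's no reward model at all, just "make the bad thing less likely than the reference did", which is exactly the kind of pressure that can smear whole regions of the output space if applied aggressively.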
Of course, the API access you have to pay for doesn't have this issue, only the "open source" model they released because they kind of promised to. The community is even talking about switching to Chinese models, which have now become less censored than the competition by virtue of Western models becoming more "safety" trained.