The world's most-visited deepfake website and another large competing site are stopping people in the UK from accessing them, days after the UK government announced a crackdown.
Mmmmmm, some of the models on CivitAI, with the right workflows, could effectively create the most versatile “deep fakes” possible. You can put someone’s face on a blank canvas and tell a model specifically trained for realistic pornography to paint a whole scene around the face. The only mitigating factor is that doing so is kinda pointless when you can just generate a random face to match your specifications, so it’s ultimately much less harmful as long as the user isn’t obsessing over depicting a distinct living person.
Also, these are primarily still images. Some animation models exist, but that process is a lot more hit-and-miss. Overall, though, I’d argue this whole use case is significantly less damaging than deep fakes, for the reasons above.
Porn fakes are old news, yes - but what the post you replied to is talking about is “deep learning” fakes (remember that term, before the great deluge of “AI”?). These either use generative networks to swap out or alter the face/body, or straight up generate simulacrum graphics/video, as opposed to a human splicing together multiple sources into comparatively bad hack jobs, as in the past.
I am just wondering if they are referring to CivitAI, which tries really hard to be seen as an art and base-model/LoRA site.
AI generated images aren’t “deep fakes”. Deep fakes came out a long time before image gen did. You take an existing movie and swap out just the face.
Nah, the site you’re referring to still works from the UK.