Spurred by Teen Girls, States Move to Ban Deepfake Nudes
(www.nytimes.com)
Still negative.
https://legalbeagle.com/8581945-illegal-pictures-people-permission.html
[...]
So even though that couple is the direct foreground subject of the image, the photographer is NOT liable for taking the picture, publishing the picture, or any damages caused by its publication. This is why the paparazzi are also protected.
In the previous post, the photographer has the rights because it's their photo, not because you're giving them any rights.
Edit: Typo
Taking photos and the right to commercial use of those photos are two different things. The reason film crews/photographers generally ask people to sign releases is that it's not clear cut. While the US is generally more forgiving, it's not a guarantee.
Right... So back to the topic of the discussion rather than adding extra shit... Someone taking pictures and running them through AI... There's no problem. They own the rights to that photo and all derivative works (except in cases where it outright violates a law: peeping tom, stalking, etc.). Public figure or not.
After that it can get gray (but I never brought up sale or commercial AI use as a thing... Not sure why people assume I did). Still, it's quite rare for a sold picture to cause a photographer problems, even if the subjects didn't necessarily consent.
Some other countries might have problems with that and have different laws on the books. But at this point it's really not hard to set up a shell company in a territory/country that doesn't have such laws... Then it no longer matters again. Good luck finding the photographer to sue.