The argument is that AI art is derivative work, and that the authors of the works used to train the model should therefore hold partial copyright over it too.
To me this is a potential can of worms. Humans can study and mimic art from other humans. It's a fundamental part of the learning process.
My understanding of modern AI image generation is that it's much more advanced than something like music sampling; it's not just a sophisticated cut-and-paste machine mashing artworks together. How would you ever determine how much of a particular artist's training data contributed to a given output?
If I create my own unique image in Jackson Pollock's style, I own the entirety of that copyright and Pollock's estate owns nothing, even though everyone would recognize the stylistic resemblance. Why should AI be different?
It feels like expanding the definition of derivative works is more likely to let companies go after real artists who mimic or are otherwise inspired by Disney/Pixar/etc. and claim partial copyright, rather than protect real artists from AI ripoffs.