I’ve no illusions about which side the AI companies are on. But what I’m afraid of is that the issue of AI training will be misused by “Big Content” (for lack of a better word) to further restrict fair use, and to raise the barrier to entry for new commercial content creators. That’s what they have always done.
Personally, I am not a fan of copyright allowing creators to retain broad control over the use of their work, whether they are commercial creators or creators releasing under a license like the GPL. This goes especially for derivative works and moral rights: I think copyright should be limited to just that, the right to copy or to forbid copying. The author controls when and how his work may be copied, so that he can derive an income from selling copies if he so desires, but for no other purpose. Derivative works should be allowed as long as the derivative is enough of an original work in its own right, inspired by the original rather than copying large parts of it verbatim. If anyone wants to write “Harry Potter and the Temple of Doom” or film “Gandalf vs Predator”, fine by me. Where that leaves AI, I’m not sure. The results from AI prompts sometimes seem to be inspired original works, while at other times they contain recognisable snippets of someone else’s work.