Is AI stealing when it trains on copyrighted work? Arguably, that's how we humans learn to write, too. Like AI, we "train" by reading copyrighted books and stories written by others, picking up their language patterns and constructions.
But while learning, if we rewrite someone's paragraph or (in the case of an artist) reproduce their drawing, we know we're in "learning mode" and won't pass the result off as our own work.
I think that check might be missing in AI: some overriding rule which says that if the current output is too close to its training sources, it should be treated as learning material and not emitted.
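A minimal sketch of what such a rule could look like, purely for illustration: compare a candidate output against known source texts using word n-gram overlap, and suppress it when the overlap crosses a threshold. The function names, the n-gram approach, and the threshold value are all my own assumptions here, not how any real system works; production systems would need far more sophisticated similarity detection.

```python
def ngram_set(text, n=5):
    # Break text into a set of lowercase word n-grams.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def too_close(candidate, sources, n=5, threshold=0.3):
    # Hypothetical "learning material" check: flag the candidate
    # if too large a fraction of its n-grams appear in any source.
    cand = ngram_set(candidate, n)
    if not cand:
        return False
    for src in sources:
        overlap = len(cand & ngram_set(src, n)) / len(cand)
        if overlap >= threshold:
            return True
    return False

source = "the quick brown fox jumps over the lazy dog and runs away"
print(too_close("the quick brown fox jumps over the lazy dog and runs away", [source]))
print(too_close("completely unrelated sentence about machine learning ideas here today", [source]))
```

A near-verbatim copy trips the filter, while genuinely different text passes; the hard part in practice would be paraphrase and style-level similarity, which simple n-gram matching can't catch.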