I don’t fault your interpretation. There is a reason Vonnegut uses the term “spiritual” throughout the book. At least for me, I would describe my understanding of the book to have required a spiritual/moral shift before I could really understand the image being painted.
I also read God Bless You Mr. Rosewater first of the two, so maybe that colored how I interpreted Player Piano. It is a more direct argument that humans need to be cared for, independent of their economic utility.
So when I read Player Piano, it didn’t strike me as an argument against automation (which, being an engineer myself, I am entirely for), but more so as a warning that freedom from labor alone doesn’t make a perfect life. Especially in the mid-20th-century context Vonnegut was writing in, it’s an argument against the “American” style of automation, wherein you displace people from their jobs and discard them entirely. They serve no further purpose to your economy, and since your society is tightly bound to the economy, they serve no purpose to society…
So it’s not really a book about automation, even if I said that in my first post. It’s a book about the failings of American culture, which happen to be revealed through automation. It’s about the inconsistency of a society where one’s usefulness to others is determined solely by one’s labor, where that labor is constantly being devalued and targeted for elimination, and what the end of that process looks like for humans who want to find meaning in their activities.
The current assumption made by these companies is that AI training is fair use, and is therefore legal regardless of license. There are still many ongoing court cases over this, but one case was already resolved in favor of the fair use position.