I feel kinda conflicted. On one hand, I don’t want AI corps to make money off of others’ work, especially given that there is no attribution with image/language models. On the other hand, I believe no one has real ownership over infinitely reproducible digital media. It’s why NFT bros were laughed at with copy-paste.
Obviously AI models don’t have any real creativity like humans do. But don’t humans make art by combining past experiences? If you look at a piece of art, it will likely end up influencing the art you make. So is that really ‘stealing’?
You can see that it’s not just artists but also record publishers who are suing AI corps. They want to make AI slop themselves but don’t want others to have it.
The discourse around AI art is entirely too property-brained* to ever arrive at an actual solution**, but “let’s actively commit industrial sabotage by feeding the AI poisonous nonsense that breaks it” is about as good a material praxis as is possible right now. If it weren’t for the material ramifications, giving everyone a magic slop generator would actually be pretty good; except in hellworld, all that does is enable grifters to churn out endless seas of low-effort slop to grift some money, while businesses cannibalize themselves harder to replace support staff with dangerously wrong chatbots and artists with empty slop generators.
It’s basically the same old conflict of the new automated machine sucking absolute shit in every practical way except for scalability and cost-per-unit. If people back then could throw their literal shoes into the works to break those machines, the least people can do now is throw a hypothetical digital shoe into its digital brain.
* It genuinely doesn’t matter past the most immediate short term whether corporations get their unlimited training data or have to build stables of training data that they own completely, because either way they get their proprietary slop generator and the harmful effects of AI generation continue unimpeded.
** AI generation needs to be a poison pill for an entire work’s ownability regardless of the ownership status of the generator’s training data, and it needs to be an exacerbating factor when used for spam or fraud.
On the other hand, I believe no one has real ownership over infinitely reproducible digital media. It’s why NFT bros were laughed at with copy-paste.
If someone made something with their own hands, they can do whatever they want with it. If they made it as a special artwork for a friend, they have every right to get upset if that friend sells it and it becomes public. I don’t care about intellectual property laws, but people still have a right to do whatever they want with their own work.
Anyway, AI art is a corporate tool now. It doesn’t matter if some random guy can produce original artwork or whatever. The point is that companies will still profit billions from this, and if it means restricting some random layman from creating art to preserve some control over your livelihood - which is quite measly compared to a traditional job - then so be it.
I feel kinda conflicted. On one hand, I don’t want AI corps to make money off of others’ work, especially given that there is no attribution with image/language models. On the other hand, I believe no one has real ownership over infinitely reproducible digital media. It’s why NFT bros were laughed at with copy-paste.
Obviously AI models don’t have any real creativity like humans do. But don’t humans make art by combining past experiences? If you look at a piece of art, it will likely end up influencing the art you make. So is that really ‘stealing’?
You can see that it’s not just artists but also record publishers who are suing AI corps. They want to make AI slop themselves but don’t want others to have it.
Bro, dawg, I have a whole-ass Kropotkin quote about how there literally is no such thing as an original thought.
I’d share it with you, but I’m drunk and lazy, and I’ll do it in the morning.