AI plagiarism wouldn’t be a problem if it weren’t for intellectual property law and capitalism. Ironically, the status quo of AI art being public domain is absolutely based, as the fruits of our stolen labor belong to us. The communists and anarchists should totally make nonprofit AI art that nobody is allowed to own. Reclaiming AI would be awesome!
Unfortunately, tech bros want to enslave all artists along with the rest of the workers, so they’ll rewrite copyright law to turn AI into their exclusive property. It’ll be an exception with no justification besides “greed = good.”
Eleuther AI
How do you continue to be so awesome and wise? Teach me your ways.
It’s random slop shat out by a machine. Art requires a living, breathing human with thoughts, emotions, and experiences; otherwise it’s just a pile of shit.
It’s only immoral, not inherently of lower quality. Aesthetics and ethics aren’t about what actually is, but about what should be. Even if an AI and a person produce the same image, the AI isn’t a living, breathing human. AI art isn’t slop because of its content, but because of the economic context. That’s a far better reason to hate it than its mistakes and shortcomings.
AI is a tool. The product can be random slop if you give it sloppy instructions, or someone can use it to realize a great artistic idea they would not be able to make real otherwise. The pictures don’t just generate themselves, you know? It’s living people who tell the machine what’s on their minds. If your mind is creative, the results can be good.
AIs take away attribution as well as copyright. The original authors don’t get any credit for their creativity and hard work. That is an entirely separate thing from ownership and property.
It is not at all OK for an AI to take a work that is in the public domain, erase the author’s identity, and then reproduce it for people, claiming it as its own.
AI can do much more than “reproduce”.
Is one of those things giving attribution? If I ask for a picture of Mount Fuji in the style of a woodblock print, can the AI tell me what its inspirations were?
it can tell you its inspiration about as well as photoshop’s content-aware fill, because it’s sort of the same tech, just turned up to 11. but it depends.
if a lot of the training data is tagged with the name of the artist, and you use the artist’s name to get that style, and the output looks like it was made by that artist, you can be fairly sure who to attribute. if not, you would have to do a mathematical analysis of the model. that’s because it isn’t storing direct links between text and images: the text part is separate from the image part, and they only communicate through a sort of shared coordinate system. one part sees text, the other sees shapes.
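to make the “shared coordinate system” thing a bit more concrete, here’s a toy sketch of a CLIP-like dual encoder, which is roughly how most text-to-image systems handle conditioning. the “encoders” below are just random projections i made up for illustration, not a real model:

```python
# toy sketch of the "shared coordinate system" idea: a text encoder and an
# image encoder that never see each other's inputs, only a common embedding
# space. the random projections are placeholders, not a trained model.
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 8  # both halves map into the same 8-dimensional space

W_text = rng.normal(size=(32, EMBED_DIM))   # stand-in for a text encoder
W_image = rng.normal(size=(64, EMBED_DIM))  # stand-in for an image encoder

def embed_text(text_features):
    v = text_features @ W_text
    return v / np.linalg.norm(v)

def embed_image(image_features):
    v = image_features @ W_image
    return v / np.linalg.norm(v)

# one part sees "text", the other sees "shapes"; the only thing they share
# is the coordinate system, where closeness stands in for "this caption
# matches this picture".
caption = embed_text(rng.normal(size=32))
picture = embed_image(rng.normal(size=64))
print("similarity:", float(caption @ picture))
```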
also, the size of the training dataset compared to the size of the finished model means that there are only a few bits of capacity per training image, at most. the fact that some models can reproduce certain input images almost exactly is basically luck, because none of the original image is in there. it just pulls together everything it knows and happens to build something that already exists.
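if you want to sanity-check that ratio yourself, the arithmetic is just one division. the numbers below are placeholders, not the specs of any particular model, so swap in whatever figures you trust:

```python
# back-of-envelope version of the dataset-size vs. model-size argument.
# all three numbers are illustrative placeholders -- swap in the figures
# you trust for whichever model and training set you mean.
model_parameters = 1.0e9    # hypothetical: ~1 billion weights
bits_per_parameter = 16     # hypothetical: fp16 weights
training_images = 2.0e9     # hypothetical: ~2 billion image-text pairs

model_bits = model_parameters * bits_per_parameter
bits_per_image = model_bits / training_images
print(f"~{bits_per_image:.1f} bits of model capacity per training image")
# a single small JPEG is hundreds of thousands of bits, so whatever the
# exact ratio is, the model cannot be storing the training images themselves.
```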
Even in a hypothetical utopia, the thought of a sea of slop drowning the creative world makes my skin crawl. Imagine putting your heart and soul into something only to watch some machine liquify it into an ugly paste in a nanosecond, and then go on to do the same thing a million times in a row. It’s hard enough to get noticed in this world, and now every passion project has to compete with the diseased inbred freak clones of other passion projects? It makes me feel so goddamn angry that some asshole felt the need to invent such a thing, and for what? What problem does it solve? Why do you need to use up a city’s worth of water to make a six-fingered Sailor Moon?
Eh. Without the economic incentive, we wouldn’t be getting a sea of slop. The energy concerns are very real though.
I generally agree (especially with the critique of using up water/power just for one image).
But I can’t get behind “this tool will make people who don’t use it feel bad.” The same arguments were leveled against Photoshop, and now it’s a tool in the arsenal. The same arguments were leveled against the camera. And I could see the same argument against the printing press (save those poor monks doing calligraphy).
The goal of “everything shall be AI” is fucked and clearly wrong. That doesn’t mean there isn’t any use for it. People who wanna crank out slop will give up when there’s no money in it and it doesn’t grant them attention.
And I say this as someone who despises how every website has an AI chatbot popping up when I visit, and every search engine is offloading actually visiting and reading pages to AI summaries.
This is where I’m coming from. Generative AI is pretty cool and useful, but it has severe limitations that most people don’t comprehend. Machine learning can automate countless time-consuming tasks. This is especially true in the entertainment industry, where it’s just another tool for production to use.
What businesses fail to understand is that it cannot perform deductive tasks without eventually making errors. It can only give probable outputs, not outputs that are guaranteed to be correct given the input. That goes against the very assumptions we make about computer logic, because it doesn’t work by deductive reasoning.
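As a rough illustration of that difference (with made-up numbers, not any particular model), compare a deterministic function with a model that can only sample from a distribution over plausible answers:

```python
import numpy as np

# Deterministic: the output is fully forced by the input.
def add(a, b):
    return a + b  # correct every single time, by construction

# Probabilistic: the model only has a distribution over candidate outputs.
rng = np.random.default_rng()
candidate_answers = ["4", "5", "44"]
confidences = np.array([0.90, 0.07, 0.03])  # made-up model confidences

def generate_answer():
    # Sampling means the plausible-but-wrong answers are always in play.
    return rng.choice(candidate_answers, p=confidences)

print(add(2, 2))                               # 4, guaranteed
print([generate_answer() for _ in range(10)])  # mostly "4", occasionally not
```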
Generative AI works by emulating biological intelligence, borrowing principles from neuroscience to solve problems quickly and efficiently. However, this gives AI weaknesses similar to our own minds’: it hallucinates and it bakes in bias. It can never give the accurate summaries Google hopes it can, because it will only ever tell us what it thinks we want to hear. Companies keep misusing it in ways that either waste everyone’s time or do serious harm.
I’m sorry, but if your argument is that “AI is doomed because current LLMs are only good at fuzzy, probabilistic outcomes,” then you do not understand current AI, or computer science, or why computer scientists are impressed by modern AI.
Discrete, concrete logic is what computers have always been good at. That is easy. What has been difficult is finding a way for computers to address fuzzy, pattern-matching, probabilistic problems. The fact that neural networks are good at those is precisely what has computer scientists excited about AI.
I’m not saying it’s doomed! I literally said that it’s cool and useful. It’s a revolutionary technology in many respects, but not for everything. It cannot replace the things computers have always been good at, but business people don’t seem to realize that. They assume that it can fix anything, not understanding that it will only make certain things worse. The trade-off is counterproductive for tasks where you need consistent indexing.
For instance, Google’s search AI turns primary sources into secondary or tertiary sources by trying to cut corners. I have zero trust in anything it tries to tell me, while all the problems search had before AI have continued to worsen. They could’ve used machine learning to better understand search queries, or to diversify results to compensate for vagueness in language, or to fucking combat SEO, but instead they clog up the results with even more bullshit! It’s a war against curiosity at this point! 😫
You sound like my grandparents complaining about techno musicians sampling music instead of playing it themselves.
Good art can be created with any medium. You view AI as replacing art; future musicians will understand it and use it to create art.
Yep this was inevitable.
The sad thing is that there is currently a vibrant open-source scene around generative AI. There is a strong media campaign against it, meant to manipulate the general population into clamoring for a strengthening of copyright laws.
This won’t lead to these tools disappearing; it will just force them behind pricey, censored subscription models while the open-source options wither and die.
They do indeed want to enslave us, and will do it with the help of people like OP.
IP, like every part of capitalism, has been totally turned against the artists it claimed to protect. If they want it to only be a chain that binds us, we need to break it. They had their chance to make it work for workers, and they squashed it. If we can’t buy into the system, we have every reason to oppose it.
On a large scale, this will come in the form of “crime,” not revolutionary action. With no social contract binding anyone voluntarily, people will do what they must to serve their own interests. Any criminal activity that weakens the system more than it weakens the people must be supported wholeheartedly. Smuggling and theft from the wealthy, true Robin Hood marks, are worthy of support. Vengeance from those scarred by the system is more justice than state justice. Revolution isn’t what the fat cats need to fear.
I need someone to train with. You or anyone else in WV?
Western Victoria?
Nah I’m in the US.
We name a lot of shit after the people who used to live here, instead of a fucking monarch.
WV used to live near you? Who’s that?