• TotallynotJessica@lemmy.blahaj.zone · 1 day ago

    AI plagiarism wouldn’t be a problem if it weren’t for intellectual property law and capitalism. Ironically, the status quo of AI art being public domain is absolutely based, as the fruits of our stolen labor belong to us. The communists and anarchists should totally make nonprofit AI art that nobody is allowed to own. Reclaiming AI would be awesome!

    Unfortunately, tech bros want to enslave all artists along with the rest of the workers, so they’ll rewrite copyright law to turn AI into their exclusive property. It’ll be an exception with no justification besides “greed = good.”

    • hungryphrog@lemmy.blahaj.zone · 4 hours ago

      It’s random slop shat out by a machine. Art requires a living, breathing human with thoughts, emotions, and experiences; otherwise it’s just a pile of shit.

      • TotallynotJessica@lemmy.blahaj.zone · 4 hours ago

        It’s only immoral, not inherently of lower quality. Aesthetics and ethics aren’t about what actually is, but about what should be. Even if an AI and a person produce the same image, the AI isn’t a living, breathing human. AI art isn’t slop because of its content, but because of the economic context. That’s a far better reason to hate it than its mistakes and shortcomings.

      • angrystego@lemmy.world · 3 hours ago

        AI is a tool. The product can be random slop if you give it sloppy instructions, or someone can use it to realize a great artistic idea they would not be able to make real otherwise. The pictures don’t just generate themselves, you know? It’s living people who tell the machine what’s on their minds. If your mind is creative, the results can be good.

    • mindlesscrollyparrot@discuss.tchncs.de · 1 day ago

      AIs take away attribution as well as copyright. The original authors don’t get any credit for their creativity and hard work. That is an entirely separate thing from ownership and property.

      It is not at all OK for an AI to take a work that is in the public domain, erase the author’s identity, and then reproduce it for people, claiming it as its own.

        • mindlesscrollyparrot@discuss.tchncs.de · 2 hours ago

          Is one of those things giving attribution? If I ask for a picture of Mount Fuji in the style of a woodblock print, can the AI tell me what its inspirations were?

          • lime!@feddit.nu · 1 hour ago

            it can tell you its inspiration about as well as photoshop’s content-aware fill can, because it’s sort of the same tech, just turned up to 11. but it depends.

            if a lot of the training data is tagged with the name of the artist, and you use the artist’s name to get that style, and the output looks like it was made by that artist, you can be fairly sure who to attribute. if not, you would have to do a mathematical analysis of the model. that’s because it’s not actually associating text with images: the text part is separate from the image part, and they only communicate through a sort of shared coordinate system. one part sees text, the other sees shapes.
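
            a rough toy sketch of that “coordinate system” idea (plain numpy, with made-up stand-in encoders, not any real model’s api): each side maps its input to a vector, and the only link between text and images is how close those vectors land in the shared space.

            ```python
            import numpy as np

            def encode_text(prompt: str) -> np.ndarray:
                # stand-in for the text encoder: in a real model a transformer
                # maps the prompt to a point in the shared embedding space
                rng = np.random.default_rng(abs(hash(prompt)) % 2**32)
                v = rng.normal(size=512)
                return v / np.linalg.norm(v)

            def encode_image(pixels: np.ndarray) -> np.ndarray:
                # stand-in for the image encoder: a vision network maps pixels
                # into the *same* space, so the two sides can be compared at all
                v = np.resize(pixels.astype(float).ravel(), 512)
                return v / (np.linalg.norm(v) + 1e-9)

            def similarity(a: np.ndarray, b: np.ndarray) -> float:
                # cosine similarity: "distance" in the shared coordinate system
                return float(a @ b)

            prompt_vec = encode_text("mount fuji in the style of a woodblock print")
            image_vec = encode_image(np.random.rand(64, 64, 3))
            print(similarity(prompt_vec, image_vec))  # closer to 1.0 = better match
            ```

            training just nudges matching pairs closer together in that space; nothing in there records which artist a picture came from unless the captions said so.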

            also, the size of the finished model compared to the size of the training dataset means that, on average, only a handful of bits could have been kept per training image. the fact that some models can reproduce input images almost exactly is basically luck, because none of the original image is stored in there as such. it just pulls together everything it knows to build something that already exists.
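
            back-of-envelope version of that, with deliberately round assumed numbers (roughly a stable-diffusion-sized model and a laion-sized dataset), just to show the scale:

            ```python
            # assumed, round numbers - not exact figures for any specific model
            params = 860_000_000              # image-model parameter count
            bits_per_param = 16               # fp16 weights
            training_images = 2_300_000_000   # images in the training set

            bits_per_image = params * bits_per_param / training_images
            print(f"~{bits_per_image:.0f} bits of weight capacity per training image")
            # -> about 6 bits, versus millions of bits for the jpeg itself,
            # so the weights cannot be holding the training images wholesale
            ```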

    • Jomega@lemmy.world (OP) · 1 day ago

      Even in a hypothetical utopia, the thought of a sea of slop drowning the creative world makes my skin crawl. Imagine putting your heart and soul into something only to watch some machine liquify it into an ugly paste in a nanosecond, then go on to do the same thing a million times in a row. It’s hard enough to get noticed in this world, and now every passion project has to compete with the diseased inbred freak clones of other passion projects? It makes me feel so goddamn angry that some asshole felt the need to invent such a thing, and for what? What problem does it solve? Why do you need to use up a city’s worth of water to make a six-fingered Sailor Moon?

      • darthelmet@lemmy.world · 1 day ago

        Eh. Without the economic incentive, we wouldn’t be getting a sea of slop. The energy concerns are very real though.

      • nectar@lemmy.world · 1 day ago

        I generally agree (especially with the current critique of using up water/power just for one image)

        But I can’t get behind “this tool will make people who don’t use it feel bad”. The same arguments were levied against Photoshop, and now it’s a tool in the arsenal. The same arguments were levied against the camera. And I could see the same argument against the printing press (save those poor monks doing calligraphy).

        The goal of “everything shall be AI” is fucked and clearly wrong. That doesn’t mean there isn’t any use for it. People who wanna crank out slop will give up when there’s no money in it and it doesn’t grant them attention.

        And I say this as someone who despises how every website has an AI chatbot popping up when I visit, and how every search engine offloads actually visiting and reading pages to AI summaries.

        • TotallynotJessica@lemmy.blahaj.zone · 1 day ago

          This is where I’m coming from. Generative AI is pretty cool and useful, but it has severe limitations that most people don’t comprehend. Machine learning can automate countless time-consuming tasks. This is especially true in the entertainment industry, where it’s just another tool for production to use.

          What businesses fail to understand is that it cannot perform deductive tasks without the possibility of error. It can only give probable outputs, not outputs that must be correct based on the input. That goes against the very assumptions we make about computer logic, because it doesn’t work by deductive reasoning.

          Generative AI works by emulating biological intelligence, borrowing principles from neuroscience to solve problems quickly and efficiently. However, this gives AI weaknesses similar to our own minds’: imagining things and baking in bias. It can never give the accurate summaries Google hopes it can, as it will only ever tell us what it thinks we want to hear. They keep misusing it in ways that either waste everyone’s time or do serious harm.
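
          To make the “probable outputs” point concrete, here’s a toy sketch (not any particular model, and the scores are made up): the last step of a language model is sampling from a probability distribution over possible next tokens, so the most likely answer is favored but never guaranteed.

          ```python
          import math, random

          # toy next-token scores for the prompt "2 + 2 =" (made-up numbers)
          logits = {"4": 6.0, "5": 2.5, "22": 1.0}

          def sample(scores: dict, temperature: float = 1.0) -> str:
              # softmax the scores, then draw a token at random in proportion
              weights = {t: math.exp(s / temperature) for t, s in scores.items()}
              r = random.uniform(0, sum(weights.values()))
              for token, w in weights.items():
                  r -= w
                  if r <= 0:
                      return token
              return token

          # a calculator evaluates 2 + 2 deductively and is right every time;
          # the sampler usually answers "4" but sometimes "5" or "22"
          print([sample(logits) for _ in range(10)])
          ```

          Turning the temperature down makes the wrong answers rarer, but it never turns the sampler into a calculator.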

          • masterspace@lemmy.ca · 1 day ago

            I’m sorry, but if your argument is that “AI is doomed because current LLMs are only good at fuzzy, probabilistic outcomes”, then you do not understand current AI, or computer science, or why computer scientists are impressed by modern AI.

            Discrete, concrete logic is what computers have always been good at. That is easy. What has been difficult is finding a way for computers to address fuzzy, pattern-matching, probabilistic problems. The fact that neural networks are good at those is precisely what has computer scientists excited about AI.
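
            A tiny illustration of that split, using only the standard library (difflib here is a classical hand-written heuristic; neural networks are what finally made this fuzzy style of judgement scale to images, speech, and language):

            ```python
            from difflib import SequenceMatcher

            # discrete logic: trivially easy for a computer, exact every time
            print("colour" == "color")                               # False

            # fuzzy matching: "is this basically the same thing?" is the kind
            # of question that is hard to pin down with explicit rules
            print(SequenceMatcher(None, "colour", "color").ratio())  # ~0.91
            ```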

            • TotallynotJessica@lemmy.blahaj.zone · 1 day ago

              I’m not saying it’s doomed! I literally said that it’s cool and useful. It’s a revolutionary technology in many respects, but not for everything. It cannot replace the things computers have always been good at, but business people don’t seem to realize that. They assume that it can fix anything, not understanding that it will only make certain things worse. The trade-off is counterproductive for tasks where you need consistent indexing.

              For instance, Google’s search AI turns primary sources into secondary or tertiary sources by trying to cut corners. I have zero trust in anything it tries to tell me, while all the problems search had before AI have continued to worsen. They could’ve used machine learning to better understand search queries, to diversify results to compensate for vagueness in language, or to fucking combat SEO, but they instead clog up the results with even more bullshit! It’s a war against curiosity at this point! 😫

      • masterspace@lemmy.ca · 1 day ago

        You sound like my grandparents complaining about techno musicians sampling music instead of playing it themselves.

        Good art can be created in any medium. You view AI as replacing art; future musicians will understand it and use it to create art.

    • Grimy@lemmy.world · 1 day ago

      The sad thing is that there is currently a vibrant open source scene around generative AI. There is a strong media campaign against it, aimed at manipulating the general population into clamoring for a strengthening of copyright law.

      This won’t make these tools disappear; it will just force them behind pricey, censored subscription models while open source options wither and die.

      They do indeed want to enslave us, and will do it with the help of people like OP.

      • TotallynotJessica@lemmy.blahaj.zone · 1 day ago

        IP, like every part of capitalism, has been totally turned against the artists it claimed to protect. If they want it to only be a chain that binds us, we need to break it. They had their chance to make it work for workers, and they squashed it. If we can’t buy into the system, we have every reason to oppose it.

        On a large scale, this will come in the form of “crime,” not revolutionary action. With no social contract binding anyone voluntarily, people will do what they must to serve their own interests. Any criminal activity that weakens the system more than it hurts the people should be supported wholeheartedly. Smuggling and theft from the wealthy (true Robin Hood marks) are worthy of support. Vengeance from those scarred by the system is more justice than state justice. Revolution isn’t what the fat cats need to fear.