• Zagaroth@beehaw.org · 1 year ago

    Normally I err on the side of ‘art’ being separated from actual pictures/recordings of abuse. It falls under the “I don’t like what you have to say, but I will defend your right to say it” idea.

    Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

    • artaxadepressedhorse@lemmyngs.social · 1 year ago

      I keep seeing people post this same idea, and I see no proof that it would actually happen.

      Why would you need “real” CP if there’s like-for-like-quality AI CP out there?

      Also, aside from going out of our way to wreck the lives of individuals who look at the stuff, are there any actual concrete stats that say we’re preventing any significant amount of RL child abuse by giving up rights to privacy or paying FBI agents to post CP online and entrap people? I don’t get behind the “if it theoretically helped one single child, I’d genocide a nation…” BS. I want to see what we’ve gained so far from these policies before I agree to give the govt more power by expanding them.

      • Radiant_sir_radiant@beehaw.org · 1 year ago

        This is a difficult one to get morally ‘right’, but I can see how legal (or at least not-all-out-illegal) AI CP could make the situation worse. First, given today’s technological advances, it will be next to impossible for law enforcement to reliably distinguish between illegal real CP and not-illegal artificial CP, meaning images and videos of actual child abuse could no longer be used as evidence in court, as the defendant can simply claim they’re AI-generated.
        Second, while a lot of consumers of CP might be happy with AI material, I expect that for a substantial number, the real thing will be considered superior or a special treat… much as many consumers of ‘normal’ porn prefer amateur porn over mass-produced studio flicks.
        The two combined would mean there’s still a considerable market for real CP, but the prosecution of child abusers would be much, much harder.

        • Krauerking@lemy.lol · 1 year ago

          See the biggest issue is that there isn’t an easy way to test any hypothesis here. And that’s a pretty big, obvious problem once you look at it.

          If you get it wrong building a battery, you maybe burn a building down; if you get it wrong trying to cure pedophilia, you end up with a molested or hurt kid at worst. And a lot more people are gonna have strong emotions about the child than about a building, even if more lives are lost in the fire.

          It’s such a big emotionally charged thing to get wrong. How do you agree to take the risk when no one would feel comfortable with the worst outcome?

          So instead it’s easy, and potentially just proper, to push it aside and blanket-say “bad”. And I hate black-or-white issues. But it’s impossible to answer without doing, and impossible to do without an answer.

          • Radiant_sir_radiant@beehaw.org · 1 year ago

            See the biggest issue is that there isn’t an easy way to test any hypothesis here.

            If I had to speculate, I could see both turning out to be true. There are probably some pedophiles for whom AI CP will help manage the urge, and some for whom the readily available content will make actual abuse more morally acceptable. But then again, we’ll probably never know for sure unless we find some success criteria, like in your nice battery example. Criteria such as “is the building on fire” give you quick, near-immediate feedback on whether or not you’ve been successful.

            The discussion reminds me of the never-ending debate on whether drugs should be legal, though. If there were to be tests with AI CP, could there be a setup similar to supplying recovering heroin addicts (and only them) with methadone? That would allow the tests to be conducted in a controlled environment, with a control group and according to reproducible criteria.

    • interolivary@beehaw.org · 1 year ago (edited)

      Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

      While I absolutely don’t want to sound like I’m defending the practice (because I’m not), I’m really not too sure of this. If this was true, would similar logic apply to other AI-generated depictions of illegal or morally reprehensible situations? Do photorealistic depictions of murder make it more likely that the people going out of their way to generate or find those pictures will murder someone or seek out pictures of real murder? Will depictions of rape lead to actual rape? If the answer to those or other similar questions is “no”, then why is child porn different? If “yes”, then should we declare all the other ones illegal as well?

      It’s not that I think AI-generated child porn should be accepted, let alone encouraged, by any means, but as was pointed out, it might actually be counterproductive to ruin someone’s life over AI-generated material in which there is factually no victim, as reprehensible as the material may be; just because something is disgusting to most of us doesn’t mean that’s a good justification for making it illegal when there is no victim.

      The reason why I’m not convinced of the argument is that a similar one has been used when eg. arguing for censorship of video games, with the claim that playing “murder simulators” which can look relatively realistic will make people (usually children) more likely to commit violent acts, and according to research that isn’t the case.

      I’d even be inclined to argue that being able to generate AI images of sexualized minors might make it less likely for the person to move on to e.g. searching for actual child porn or committing abuse, as it’s a relatively easier and safer way for them to satisfy an urge. I wouldn’t be willing to bet on that, though.