28 comments

  • echoangle a day ago

    Is there an actual case for outlawing this that isn't based on moral panic? Wouldn't you actually want people to generate those images with AI so they are less incentivized to pay for the real stuff?

    As long as you don't need actual CSAM material in the training data and the generated images are different enough from a real person (both of which seem to be very possible technology-wise), that seems to be a good thing.

    Or is there any indication that availability of CSAM material actually increases the likelihood that people act on it later?

    • _aavaa_ a day ago

      We don't have (and I doubt we ever will have) tools for distinguishing between real and AI-generated images with guaranteed 100% accuracy (i.e., zero false negatives and zero false positives).

      Given that, I don't see how you can allow AI-generated CSAM without effectively making "real" CSAM images unprosecutable.

      • echoangle a day ago

        So you think that currently, until this law is implemented, CSAM is effectively unprosecutable because people can just claim they generated the image with AI?

        • _aavaa_ a day ago

          I think there is a >0% probability that an individual case becomes unprosecutable (or at least that the image evidence becomes much less useful) if the person in question actively starts generating CSAM with AI for the purpose of casting doubt on the legitimacy of any individual real image the prosecutor wants to use as evidence.

          The standard is beyond a reasonable doubt, and I think that's going to become an increasingly difficult bar to clear if AI-generated versions (either made for their own case or as decoys) are allowed to remain legal.

        • polski-g 12 hours ago

          Well, they could ask the child in the photo...

      • CaptainFever a day ago

        You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.

        (You need to sign both the models and the programs to make sure there's no img2img.)

        • lm28469 a day ago

          The level of tech-solutionist brain rot you need to reach to propose a state-sponsored child porn generator... This forum is a parody of itself.

          • PathfinderBot 20 hours ago

            So no real arguments against it, only insults. That's great news, thank you! :)

            • lm28469 10 hours ago

              You don't need scientific arguments for everything, you know. What's your argument against consenting 10-year-old siblings having sex together if they use protection? I don't have one, but I know it's morally wrong and won't bring anything good.

        • echoangle a day ago

          You don’t even need to give them a model, just generate some images and publish them. If you find those images, it’s fine, if you find anything else, arrest them.

          • PathfinderBot 20 hours ago

            That works too, though it'll of course result in a smaller selection and therefore smaller impact on the real market.

        • _aavaa_ a day ago

          We can't agree on weed or safe injection sites, you think we'll have government approved CP generation?

          • PathfinderBot 20 hours ago

            I totally agree, we should aim for all three harm reduction measures.

    • AlecSchueler 13 hours ago

      > Wouldn't you actually want people to generate those images with AI so they are less incentivized to pay for the real stuff?

      You got a study showing that's how it works!

      > Or is there any indication that availability of CSAM material actually increases the likelihood that people act on it later?

      There's a decent amount of research that shows the escalating nature of pornography consumption.

      https://fightthenewdrug.org/how-porn-can-become-an-escalatin...

      You can follow the references in this article.

      Surely it's better to ask why people are looking for any kind of child abuse material, generated or not, and find ways to help them.

      As much as it's reasonable to worry about moral panic one might also worry about moral complacency.

      • echoangle 11 hours ago

        > You got a study showing that's how it works!

        Assuming that’s a question, no I don’t.

        > There's a decent amount of research that shows the escalating nature of pornography consumption.

        Watching more extreme porn doesn’t mean you actually do it in real life later though.

        > Surely it's better to ask why people are looking for any kind of child abuse material, generated or not, and find ways to help them.

        Wouldn’t it be best to do both? And also, is there a known way to „help“ pedophiles? Isn’t it like wanting to „help“ homosexual people by converting them to being heterosexual? They are attracted to what they’re attracted to, is there any chance of changing that?

    • dyauspitr a day ago

      I completely agree. Most people were marrying 13- and 14-year-olds less than 100 years ago. Yes, we want people to only be considered sexually attractive after they turn 18, but that's not reality.

      That being said I don’t know if the availability of CSAM would increase or decrease real world abuse.

      • AlecSchueler 13 hours ago

        > Most people were marrying 13 and 14 year olds less than 100 years ago.

        Not on planet Earth. What are you basing this claim on?

  • upmind a day ago

    We really need to make it possible to pass laws faster; 2026 is going to be an insane year for multimodal models, and the laws are simply not keeping up.

  • nicbou 13 hours ago

    I always found the issue of artificial CSAM fascinating.

    It's repulsive content, but entirely synthetic. Or is it? How do they get the subject right? Where do they draw their inspiration from?

    What does having communities form around such content mean? What harm does normalising the sexualization of children do? Does it suppress an urge or encourage it?

    I struggle to see it as a victimless crime. It might not directly harm children, but the idea of a "scene" for this sort of content is disturbing.

    On the other hand, my entire line of reasoning is the same people used to try to ban violent video games. It's the "gateway drug" to the really bad thing. It feels like the same debate with an ickier subject.

    • Grimblewald 11 hours ago

      Try to imagine how nuanced you'd feel the topic is when someone takes an image of your child and distributes AI CSAM in their likeness.

      Imagine how nuanced you'd feel the topic is when real CSAM can hide in broad daylight because everyone assumes it's fake, and detection of real vs. fake becomes impossible with existing tech.

      This is a hill I'll die on, violently. There's no real nuance here; the damage it causes is far too extreme to even entertain the thought beyond a few thought experiments to confirm there is no case where this is ever acceptable.

  • SilverElfin a day ago

    I don’t understand why it needs to be banned. If it is artificial, whether it is a story someone wrote, an animation someone drew, or a photo-realistic AI-generated thing, it’s just not real. There is no harm committed against a victim. It feels like this is a moralistic crusade, adjacent to age-verification laws that are just backdoor porn bans (freely admitted by the conservatives who support such laws).

    The bigger issue is that these types of bans feel a lot more like banning speech than banning a real crime, and the precedent it sets can end up being used in far-reaching ways. That’s how it always is.

    • Jigsy a day ago

      I can't agree with the photorealistic AI images because they're indistinguishable from an actual photograph.

      Everything else I do agree with you on, though.

      The problem is, prosecutors are just looking for easier ways to jail people for things they could do, based on what they personally believe. (E.g. "Manga causes child abuse")

      • bitwize a day ago

        The United States already considers artwork that resembles a real minor to be outside the First Amendment and hence illegal. Even like, cartoon artwork. If you're fapping to naked Bart Simpson that's one thing, but if it's a drawing of a real child you are using that child's image as a sexual object, that can be profoundly traumatizing, and it is seen to cross the threshold of "actually abusing a child" that justifies not applying the First Amendment. People's likenesses in general are subject to strong protection in the United States and you can face strong penalties for misusing them, even if porn is not involved; consider White v. Samsung.

        • Jigsy a day ago

          > but if it's a drawing of a real child

          How would you even prove that, though?

    • like_any_other a day ago

      > If it is artificial, whether it is a story someone wrote

      Already illegal in Australia: https://www.independent.co.uk/news/world/australasia/sydney-... (don't hold your breath on it making any "banned books" lists)

      People laughed at Indians believing photos stole one's soul, and now we have legislated even stupider behavior, without the excuse of ignorance.

      • Jigsy a day ago

        Australia also believes that women with small breasts in porn cause people to become child abusers...

    • Stevvo a day ago

      Datasets such as LAION-5B have been found to contain thousands of CSAM images, so real victims are involved indirectly.

  • oldestofsports a day ago

    We should really up the game and completely ban all AI generated images depicting people, because we have no good way of knowing whether an image is AI generated or real, and images depicting people have terrible consequences in society when weaponized.