A simpler way to remove explicit images from Search

(blog.google)

38 points | by gnabgib 10 hours ago

29 comments

  • dannyw 7 hours ago

    Looks like a nice and well designed improvement that will help people.

    I can see this is related to the sad and ongoing ‘purification’ of the internet, but still, not going to get upset over better UX for taking down deepfakes or non-consensual explicit images which do hurt people.

  • xeyownt 7 hours ago

    So you can pinpoint for Google which images are of high (damaging) value, and Google will show you more of them.

    What could go wrong?

    • dannyw 7 hours ago

      Oh wow. Actually, that’s a really good point; I’m not sure how you could counter that (many regulations do not allow Google to hide reporting/takedown flows etc. behind an account).

      Hopefully Google didn’t just build the world’s best deepfake search…

  • nly 5 hours ago

    Why limited to sexual images?

    Why can I not control whether any kind of public image of me at all appears?

  • undefined 9 hours ago
    [deleted]
  • etchalon 8 hours ago

    "Please don't regulate us" step 6,438.

  • vasco 8 hours ago

    I don't see how the religious groups that pressured card payment processors into dropping Pornhub et al. aren't going to abuse this by mass-reporting any nude picture they find as their own.

    • BuyMyBitcoins 7 hours ago

      >”those religious groups that forced card payment processors to ban pornhub et al”

      I question how much influence such groups actually have, given that payment processors already dislike dealing with adult oriented businesses.

      The percentage of chargebacks and disputes for those transactions is significantly higher than in any other category. Companies hate having to investigate such claims and issue new cards, even when it appears fairly obvious the purchase was made by the cardholder. It’s also tricky from a customer service standpoint, because the cardholder may well be lying in order to hide an embarrassing purchase from a spouse or other family member.

      It seems like payment processors just want to get rid of a hassle for themselves.

    • undefined 7 hours ago
      [deleted]
  • mlindner 8 hours ago

    Google practically never shows explicit images to anyone anymore anyway. Even Bing doesn't. I feel like we've returned to a more prudish society, at least on the mainstream internet.

    • advisedwang 8 hours ago

      I don't think it's prudish to want the ability to take down deepfakes of you naked or leaked images of you.

      • Dylan16807 8 hours ago

        Your comment has basically no connection to the comment you replied to. (Which itself had a weak connection to the article, but that's a separate issue.)

        • cush 8 hours ago

          The article is about removing non-consensual sexually explicit images and deepfakes

          • ReptileMan 6 hours ago

            And on whom does the burden of proof fall that it is non-consensual or a deepfake? We danced a similar dance with DMCA takedowns.

        • undefined 8 hours ago
          [deleted]
    • cush 8 hours ago

      The article mentions they're introducing a new way to request the removal of non-consensual explicit images on Search

      The key bit is non-consensual, so it's unrelated to individual morality; they're providing a way to report a real crime.

      • robocat 8 hours ago

          Please don't comment on whether someone read an article.
        
        https://news.ycombinator.com/newsguidelines.html

        • selcuka 7 hours ago

          The GP comment is in compliance with the guideline:

          > Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that".

          "You should really read the article" is semantically the same as "The article mentions that". It's not a question.

        • cush 7 hours ago

          k

    • efilife 8 hours ago

      Even DuckDuckGo has started censoring its results, though very subtly. A friend (really) showed me an explicit search query with safe search turned off, and I compared the results with Yandex; I was really surprised at how different they were. Nothing explicit on DDG, even though the query included the word "hentai".

      (I am aware this is not really related to the article. I think this is a cool discussion to be had)

  • guessmyname 8 hours ago

    Why is Google indexing these harmful images in the first place?

    Microsoft, Google, Facebook, and other large tech companies have had image recognition models capable of detecting this kind of content at scale for years, long before large language models became popular. There’s really no excuse for hosting or indexing these images as publicly accessible assets when they clearly have the technical ability to identify and exclude explicit content automatically.

    Instead of putting the burden on victims to report these images one by one, companies should be proactively preventing this material from appearing in search results at all. If the technology exists, and it clearly does, then the default approach should be prevention, not reactive cleanup.

    • Dylan16807 8 hours ago

      How is an image model supposed to detect if there was consent to share the picture?

      If you're saying they shouldn't index any explicit images, you're talking about something very different from the article.

      • drdaeman 8 hours ago

        I think the “one by one” part allows different interpretations of what guessmyname meant.

        But I fail to make sense of it either way. Either the nuance about lack of consent is missing, or Google is being blamed for not having done from the start what they just did.

    • whatevermom5 8 hours ago

      They probably make money showing porn search results.