Looks like a nice and well designed improvement that will help people.
I can see this is related to the sad and ongoing ‘purification’ of the internet, but still, not going to get upset over better UX for taking down deepfakes or non-consensual explicit images which do hurt people.
So you can pinpoint to Google which images are of high (damaging) value, and Google shows you more of them.
What could go wrong?
Oh wow. Actually, that’s a really good point; I’m not sure how you could counter that (lots of regulations ~do not allow Google to hide reporting/takedown flows etc behind an account).
Hopefully Google didn’t just build the world’s best deepfake search…
Why limited to sexual images?
Why can I not control whether any kind of public image of me at all appears?
"Please don't regulate us" step 6,438.
I don't see how the religious groups that pressured card payment processors into banning Pornhub et al. won't abuse this by mass-reporting any nude picture they find as their own.
>”those religious groups that forced card payment processors to ban pornhub et al”
I question how much influence such groups actually have, given that payment processors already dislike dealing with adult oriented businesses.
The percentage of chargebacks and disputes made for those transactions is significantly higher than in any other category. Companies hate having to investigate such claims and issue new cards, even when it appears fairly obvious the purchase was made by the cardholder. It's also tricky from a customer service standpoint, because the cardholder may well be lying in order to hide an embarrassing purchase from a spouse or other family member.
It seems like payment processors just want to get rid of a hassle for themselves.
Well, the religious groups certainly take the credit for themselves and continue their quest; the latest target was Steam.
https://www.collectiveshout.org/progress_in_global_campaign_...
Google practically never shows explicit images to anyone anymore anyway. Even Bing doesn't anymore. I feel like we've returned to a more prudish society, at least on the mainstream internet.
I don't think it's prudish to want the ability to take down deepfakes of you naked or leaked images of you.
Your comment has basically no connection to the comment you replied to. (Which itself had a weak connection to the article, but that's a separate issue.)
The article is about removing non-consensual sexually explicit images and deepfakes
And on whom falls the burden of proof that an image is non-consensual or a deepfake? We danced a similar dance with DMCA takedowns.
The article mentions they're introducing a new way to request the removal of non-consensual explicit images on Search
The key bit is non-consensual, so it's unrelated to individual morality; they're providing a way to report a real crime.
The GP comment is in compliance with the guideline:
> Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that".
"You should really read the article" is semantically the same as "The article mentions that". It's not a question.
k
Even DuckDuckGo started censoring their results, although very subtly. My friend (really) showed me an explicit search query with safe search turned off, which I then compared with Yandex, and I was really surprised how different the results were. Nothing explicit on DDG, even though the query included the word "hentai".
(I am aware this is not really related to the article. I think this is a cool discussion to be had)
Why is Google indexing these harmful images in the first place?
Microsoft, Google, Facebook, and other large tech companies have had image recognition models capable of detecting this kind of content at scale for years, long before large language models became popular. There’s really no excuse for hosting or indexing these images as publicly accessible assets when they clearly have the technical ability to identify and exclude explicit content automatically.
Instead of putting the burden on victims to report these images one by one, companies should be proactively preventing this material from appearing in search results at all. If the technology exists, and it clearly does, then the default approach should be prevention, not reactive cleanup.
How is an image model supposed to detect if there was consent to share the picture?
If you're saying they shouldn't index any explicit images, you're talking about something very different from the article.
I think the "one by one" part allows for different interpretations of what guessmyname meant.
But I fail to make sense of it either way. Either the nuance of lack of consent is being missed, or Google is being blamed for not having done from the very beginning what it just did.
They probably make money showing pork search results
Porkin' across America! https://www.youtube.com/playlist?list=PL4NL9i-Fu15jdlr2KQf_l...
That sounds haram.
Filthy pork addicts...
Oink oink