Google caught accepting payments to promote
AI apps that produce nonconsensual nudes.
17/8/2024
According to fresh information from 404 Media, Google is accepting payments to promote AI apps that generate nonconsensual deepfake nudes.
As 404 found, searches for terms like "undress apps" and "best deepfake nudes" return paid adverts for websites offering services such as an "NSFW AI Image Generator" and other readily accessible AI tools that can be used to create explicit imagery of real individuals without their knowledge.
Google has long come under fire for failing to stop the spread of AI deepfakes of real people in its search results, since such tools have always been absurdly simple to locate on the search giant's platform, just one search query and one click away.
Just last week, in response to this criticism, Google announced that it would broaden its search policy to "assist people impacted" by the "non-consensual sexually explicit fake content" that appears on its search results pages.
However, as 404's research shows, Google's deepfake issue extends to its ad business: the search engine behemoth is actively making money from sponsored posts promoting some of the same AI services that enable unscrupulous actors to create intrusive and nonconsensual sexual content in the first place.
404's journalists report that Google has since removed the specific advertisements and pages they flagged. Companies that seek "to create synthetic sexual or nude content are prohibited from advertising through any of our platforms or generating revenue through Google Ads," a Google representative told the site.
The Google representative went on to tell 404 that the company is "actively investigating this issue and will permanently suspend advertisers who violate our policy, removing all of their ads from our platforms." However, 404 notes that the spokesperson failed to answer the fundamental question of why Google allows advertisers to pay for the promotion of links against search phrases like "undress app" in the first place.
Deepfaked porn hurts the people depicted in it. Furthermore, a growing number of alarming incidents involve nonconsensual fake nudes, especially in middle and high schools, where school systems and law enforcement are struggling to curb the practice.
Such cases keep surfacing because the public has easy access to AI deepfake tools.
Given how easy these deepfake tools are to find through search, Google, the website most people use to find things, clearly has a long way to go in its efforts to stem the swelling tide of deepfakes on the web.