Google is sending a stern message to advertisers: Do not affiliate with the dark side of AI.
The search giant is cracking down on the promotion of deepfake pornography: the use of artificial intelligence to alter a person’s face or body to depict them in sexual scenes.
The tech titan is giving advertisers a tight deadline to cut ties with sites involved in the unethical practice.
A new ad policy released by Google targets “sites or apps that claim to generate deepfake pornography, instructions on how to create deepfake pornography, endorsing or comparing deepfake pornography services.”
“This update is to explicitly prohibit advertisements for services that offer to create deepfake pornography or synthetic nude content,” Google spokesperson Michael Aciman told The Verge.
Ads within that “scope” will be banned starting May 30, according to the updated policy.
“We take violations of this policy very seriously and consider them egregious,” it reads.
“If we find violations of this policy, we will suspend your Google Ads accounts upon detection and without prior warning, and you will not be allowed to advertise with us again.”
Last year, Google removed nearly 2 billion ads for violating sexual content policies.
Problems with deepfakes are running rampant online, with the issue also impacting teens.
Schools in the Los Angeles area are warning parents that deepfakes of students are circulating in California, with incidents reported at schools such as Beverly Hills High School.
Celebrities like Taylor Swift, Jenna Ortega, and Congresswoman Alexandria Ocasio-Cortez have also been targeted by deepfakes.