New Delhi
Wednesday, December 25, 2024

Google Takes Action Against Inappropriate Deepfakes


In Short:

Google has removed fake explicit images of Jennifer Aniston from search results through new ranking adjustments. These changes have reduced exposure to AI-generated explicit content in searches targeting specific individuals by over 70%. The company now surfaces articles warning about the dangers of deepfakes in place of explicit images. Google will also apply measures to reduce the discoverability of unwanted explicit imagery, though some feel more action is needed. These efforts are meant to give people peace of mind about their online privacy.


Google Search Results Adjusted to Combat Deepfake Images

A few weeks ago, a Google search for “deepfake nudes jennifer aniston” returned several results claiming to offer explicit, AI-generated images of the actress. Those results have now disappeared.

Reduction in Exposure to Fake Explicit Images

According to Google product manager Emma Higham, recent adjustments in how the company ranks search results have considerably decreased the visibility of fake explicit images. Searches targeting specific individuals for explicit content have seen a decline of over 70%. Google’s algorithms now prioritize news articles and non-explicit content over problematic results.

Focus on News Articles and Non-Explicit Content

Rather than displaying non-consensual fake images, searches such as the one involving Jennifer Aniston now return articles like “How Taylor Swift’s Deepfake AI Porn Represents a Threat” and references to warnings about “deepfake celebrity-endorsement scams”.

Enhanced Measures Against Non-Consensual Explicit Imagery

In response to growing concerns, Google is implementing stricter measures to combat the proliferation of non-consensual explicit images online. These efforts involve removing duplicates of sexualized deepfakes, filtering explicit images in related searches, and demoting websites with high takedown request volumes in search results.

Peace of Mind for Users

Higham emphasized that these actions aim to provide users with added peace of mind, particularly for those worried about similar content resurfacing in the future.

Further Challenges and Potential Solutions

While Google acknowledges that these measures are not foolproof, former employees and advocates for victims argue that more comprehensive steps could be taken. For example, although Google displays warnings for searches involving illegal content depicting children, no similar alerts appear for searches related to non-consensual adult sexual deepfakes.

