Google has rolled out updates to Search aimed at reducing the visibility of deepfakes, making explicit manipulated images harder to find. As part of its ongoing effort against realistic-looking fake imagery, the company is streamlining the process for individuals to have non-consensual fake images of themselves removed from Search. Users have always been able to request the removal of such images, but once a request is granted, Google will now also filter explicit results from similar searches about that person.

Google has also adjusted its ranking systems so that searches for explicit deepfakes that include a person's name surface "high-quality, non-explicit content" instead, such as news articles about the individual. In addition, the company plans to educate users who search for deepfakes by presenting results that discuss their societal impact.

Google acknowledges the need to distinguish legitimate content, such as an actor's nude scene, from fake explicit imagery. One method it is employing is demoting sites in Search that have received a large number of removal requests for manipulated images. According to Google, a high volume of such requests signals that a site is low quality, an approach it says has worked against other types of harmful content in the past.