Microsoft has partnered with StopNCII to help remove non-consensual intimate images, including deepfakes, from its Bing search engine. Victims can open a "case" with StopNCII, which creates a digital fingerprint (hash) of an intimate image on the victim's own device, so the file itself never has to be uploaded. Several other tech companies, including Meta, TikTok, Bumble, and Reddit, have joined the effort to scrub unauthorized images from their platforms. Google, however, is notably absent from this list, despite offering its own tools for reporting non-consensual images. Efforts by the US government, such as the US Copyright Office's call for new legislation and the introduction of the NO FAKES Act, aim to address the harms of deepfakes. Adult victims of non-consensual intimate image-sharing can open a case with StopNCII or request removals through Google's own tools, while those under 18 can file a report with NCMEC.