The White House released a statement today outlining commitments from several AI companies to combat image-based sexual abuse by preventing the creation and distribution of non-consensual intimate images (NCII) and child sexual abuse material (CSAM). Adobe, Anthropic, Cohere, Common Crawl, Microsoft, and OpenAI have pledged to incorporate feedback loops and stress-testing strategies, and to remove nude images from AI training datasets. While the commitment is voluntary, with no new enforceable steps or consequences, it is a positive effort toward tackling a serious problem. Notably absent from the initiative are Apple, Amazon, Google, and Meta. Separately, many big tech and AI companies have partnered in other efforts to stop the spread of deepfake images and videos. Victims of NCII can report incidents to StopNCII, while those under 18 can file reports with NCMEC.