A US investigator is using AI to detect child abuse images made by AI

by SkillAiNest

In a filing posted on September 19, Hive cofounder and CEO Kevin Guo told MIT Technology Review that he could not discuss the details of the contract, but confirmed that it involves the use of the company's AI detection algorithms for child sexual abuse material (CSAM).

The filing cites data from the National Center for Missing & Exploited Children, which reported a 1,325% increase in incidents involving generative AI in 2024.

The top priority for child-exploitation investigators is to find and stop any abuse that is currently happening, but the flood of AI-generated CSAM has made it difficult for investigators to know whether a real victim is at risk. A tool that can successfully flag real victims would be a major help as they try to prioritize cases.

The filing states that identifying AI-generated images "ensures that investigative resources are focused on real victims, maximizing the impact of the program and protecting the vulnerable."

Hive AI offers tools that generate videos and images, as well as a range of content moderation tools that can flag violence, spam, and sexual material, and even identify celebrities. In December, MIT Technology Review reported that the company was selling its deepfake-detection technology to the US military.
