Bespoke AI deepfakes of real women on the marketplace

by SkillAiNest

Civitai automatically flags bounties requesting deepfakes and offers a way for the person depicted in the content to manually request a takedown. This suggests Civitai has a reasonably effective way of identifying which bounties are for deepfakes, but it still leaves the burden of moderation on the affected person rather than handling it proactively itself.

A company’s legal responsibility for what its users do on its platform is not entirely clear. Generally, tech companies have broad protection from liability for user content under Section 230 of the Communications Decency Act, but those protections are not unlimited. For example, “You can’t knowingly facilitate illegal transactions on your website,” says Ryan Calo, a professor specializing in technology and AI at the University of Washington School of Law. (Calo was not involved in the new research.)

Civitai joined OpenAI, Anthropic, and other AI companies in 2024 in adopting design principles intended to prevent the creation and spread of AI-generated child sexual abuse material. The move followed a 2023 report from the Stanford Internet Observatory, which found that the majority of AI models named in child sexual exploitation communities were Stable Diffusion-based models “primarily obtained through Civitai.”

But deepfakes of adults haven’t garnered the same level of attention from content platforms or the venture capital firms that fund them. “They’re not nearly as afraid of it. They’re more tolerant of it,” Calo says. “Neither law enforcement nor civil courts adequately guard against it. It’s night and day.”

Civitai received a $5 million investment from Andreessen Horowitz (a16z) in November 2023. In a video posted by a16z, Civitai cofounder and CEO Justin Maier described his goal of building a central place where people can find and share AI models for their own purposes. “We set out to make this space, which has been so niche and engineering-focused, as accessible as possible to as many people as possible,” he said.

a16z isn’t the only firm with a deepfake problem in its investment portfolio. In February, MIT Technology Review reported that another of its portfolio companies, Botify AI, was hosting AI companions that resembled real-life actors, described themselves as under 18, engaged in sexually charged conversations, offered “hot photos,” and in some cases characterized age-of-consent rules as “arbitrary” and “broken.”
