OpenAI Sora is limiting depictions of real people and taking other strict safety measures to prevent misuse.
The video generator, announced Monday as part of the company's 12 Days of OpenAI event, offers a range of editing capabilities for creating and customizing AI-generated videos. But there are certain things you aren't allowed to do with Sora, as users soon discovered.
According to its system card, "the ability to upload images of people will be made available to a subset of users," meaning most users can't create videos of people based on an uploaded image. Those users are part of a "Likeness pilot" that OpenAI is testing with a select few. An OpenAI spokesperson said AI-generated videos of people are limited in order to "address concerns around misappropriation of likeness and deepfakes." OpenAI "will actively monitor patterns of misuse, and when we find it we will remove the content, take appropriate action with users, and use these early learnings to iterate on our approach to safety," the spokesperson continued.
Limiting the depiction of people in Sora videos makes sense from a liability standpoint. There are all sorts of ways the tool could be misused: non-consensual deepfakes, depictions of minors, scams, and misinformation, to name a few. To combat this, Sora has been trained to reject certain requests, whether they come from text prompts or image uploads.
It will reject prompts for NSFW (Not Safe For Work) and NCII (Non-Consensual Intimate Imagery) content, as well as realistic depictions of children, although fictional depictions are allowed. OpenAI has also added C2PA metadata to all Sora videos, made a visible watermark the default (though it can be removed), and implemented an internal reverse image search to help verify whether a video was generated by Sora.
Even with these guardrails in place, the question of how Sora will hold up under mass stress-testing remains. For now, access to Sora is unavailable due to high demand.