OpenAI introduced the AI video platform earlier this year and is now extending its reach to more users.
OpenAI has opened the gates to its AI video generator, Sora, allowing the public to try the tool on their devices. Sora is OpenAI’s text-to-video tool, letting you create realistic-looking videos from text prompts. According to the company, Sora, which means “sky” in Japanese, is “able to generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background.”
OpenAI introduced Sora in February this year, but its availability was limited to select groups. The company now appears satisfied with the platform’s progress and is allowing more people to start using Sora.
OpenAI Sora AI Video Generator: What It Offers
Picture yourself writing a prompt for an imaginative scenario, such as a cat dancing atop a spaceship. Sora excels at translating these prompts into detailed, realistic videos. The company also claims that the “model understands not only what the user has asked for in the prompt but also how those things exist in the physical world.”
The new Sora version can generate videos in up to 1080p quality, but only for clips up to 20 seconds long. You can create them in widescreen or vertical format, and you can add external music files and remixes to personalise the content. OpenAI has also added Recent and Featured tabs that give access to content from others in the Sora community.
Sora is coming to ChatGPT Plus subscribers at no extra cost, giving them the ability to create 50 videos in 480p quality, or fewer videos at 720p or HD resolution. OpenAI also has a new ChatGPT Pro plan that promises 10x more usage, support for higher resolutions and longer video clips.
OpenAI assures that all AI-generated content will be watermarked and tagged with details that identify it as AI-made. The company is also working to make Sora accessible to more people, which should happen sometime next year.