Google’s new ‘open’ Gemma AI models can interpret images and short videos in addition to text, and run on anything from a phone to a workstation
Mar 12, 2025, 7:46 AM UTC
Google Gemma
Richard Lawler
Richard Lawler is a senior editor following news across tech, culture, policy, and entertainment. He joined The Verge in 2021 after several years covering news at Engadget.
A little over a year after releasing two “open” Gemma AI models built from the same technology behind its Gemini AI, Google is updating the family with Gemma 3. According to the blog post, these models are intended for developers creating AI applications that can run wherever they’re needed, on anything from a phone to a workstation. They support over 35 languages and can analyze text, images, and short videos.
The company claims that it’s the “world’s best single-accelerator model,” outperforming competitors like Meta’s Llama, DeepSeek, and OpenAI’s models on a host with a single GPU, and that it offers optimized capabilities for running on Nvidia’s GPUs and dedicated AI hardware. A 26-page technical report is available that goes deeper into those claims.
Last year it was unclear how much interest there would be in a model like Gemma; however, the popularity of DeepSeek and others shows there is interest in AI tech with lower hardware requirements. Despite its claims of advanced capabilities, Google also says, “Gemma 3’s enhanced STEM performance prompted specific evaluations focused on its potential for misuse in creating harmful substances; their results indicate a low risk level.”
What exactly constitutes an “open” or “open source” AI model remains a topic of debate, and with Google’s Gemma, that has focused on the company’s license that restricts what people are allowed to use it for, which has not changed with this new release. Google is continuing to promote Gemma with Google Cloud credits, and the Gemma 3 Academic program will allow academic researchers to apply for $10,000 worth of credits to accelerate their research.