On 15 October 2025 at 11:30 GMT, Google announced Veo 3.1, a significant upgrade over Veo 3. The new model produces more photorealistic clips, follows user instructions more closely, and makes video editing easier than before. The update is rolling out across Google's platforms: the Flow video editor, the Gemini app, Vertex AI, and the Gemini API.
Veo 3.1 builds on the foundation laid by Veo 3, released in May. Users will immediately notice more realistic output: generated clips are more colorful, dynamic, and lifelike. On the audio side, the update closes several gaps left in Veo 3, improving the overall user experience.
Veo 3.1 extends the editing features introduced in Veo 3: users can supply reference images that inform character movement, define the start and end frames of AI-generated clips, and loop videos based on their last few frames. These enhancements give creators more versatility and control over their video projects.
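As an illustration only, the editing controls described above (a text prompt plus optional first frame, last frame, and reference images) could be gathered into a single request payload along the lines of the sketch below. Every field name here (`prompt`, `first_frame`, `last_frame`, `reference_images`, `model`) is a hypothetical placeholder for explanation purposes, not the actual Gemini API schema.

```python
# Hypothetical sketch: assembling a Veo 3.1-style generation request.
# All field names are illustrative assumptions, NOT the real Gemini API schema.

def build_video_request(prompt, first_frame=None, last_frame=None,
                        reference_images=None, model="veo-3.1"):
    """Collect the editing controls described in the article into one payload."""
    payload = {"model": model, "prompt": prompt}
    if first_frame is not None:          # optional starting frame for the clip
        payload["first_frame"] = first_frame
    if last_frame is not None:           # optional ending frame for the clip
        payload["last_frame"] = last_frame
    if reference_images:                 # images guiding character appearance/movement
        payload["reference_images"] = list(reference_images)
    return payload

request = build_video_request(
    "A fox runs through fresh snow at dawn",
    first_frame="fox_start.png",
    last_frame="fox_end.png",
    reference_images=["fox_ref.png"],
)
```

The point of the sketch is simply that each new Veo 3.1 control is an optional input layered on top of the basic text prompt; consult Google's own API documentation for the real interface.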
As the ongoing rollout of Veo 3.1 shows, Google remains committed to advancing video editing technology. By adding more sophisticated tools that better serve creators' needs, the company aims to improve the creative process for users at every level.
With Veo 3.1's improved responsiveness to prompts, users can expect greater precision when generating clips. This addresses a common challenge for video editors: ensuring that the final product matches their vision. According to the announcement, Google also plans to make the production side much easier.
Google has long operated at the cutting edge of video production, and with Veo 3.1 it has set a new standard for AI-assisted editing tools. The technology pairs next-level realism with advanced audio capabilities, and the company's focus on easy-to-use features keeps it ahead in this fast-moving field.