Luma AI’s latest video model, Ray3, is now available in the Firefly app, where you can use it to generate videos and explore creative concepts in Firefly Boards.
Following last month's introduction of Google's Gemini 2.5 Flash Image (aka Nano Banana) into Firefly, Adobe says that Firefly customers can now get early access to Luma AI's latest Ray3 model ahead of its broader public release.
For the next two weeks, Ray3 will be available only in the Firefly app and on Luma AI's Dream Machine platform. That early-access window matters because Ray3 represents a clear advance over the previous generation of video AI models.
Ray3 lets you generate cinematic, high-quality 10-second clips, and is one of the first video AI models to support native HDR, delivering richer contrast, deeper shadows, and brighter highlights in professional-grade formats. It's also built on a new multimodal reasoning system that, Luma says, helps it better understand the user's creative intent, plan coherent scenes, maintain character consistency, and produce motion that feels natural.
In practice, that means you can use Ray3 in Firefly's Text to Video to quickly generate b-roll or background footage that complements your content, "whether you're filling gaps in a product tutorial video, adding narrative depth to an Instagram Reel or building dynamic transitions for TikTok."
The distinctions drawn in Adobe's blog post announcing the new integration are interesting, arriving as they do in the wake of Netflix's latest guidelines on AI usage: for filmmakers, the emphasis is on pre-visualization rather than finished footage.
"If you’re a filmmaker, producer or visual storyteller, you can use Ray3 in Firefly Boards to explore various visual directions before moving forward with your shoot. It’s easy to generate environments, shot compositions and camera perspectives while storyboarding or planning scenes. Whether you’re imagining a chase through a bustling city or a quiet moment in a remote landscape, Ray3 helps you prototype your vision quickly."
However you use it, as with all partner models integrated into the Firefly app, nothing you generate in Adobe apps will be used to train generative AI models. All AI-generated content in Firefly also includes Content Credentials, so you always know which AI was used to create it, helping to ensure transparency at every step of the creative process.
Ray3 joins models from OpenAI, Ideogram, Pika, Black Forest Labs, and Runway, all of which are already available within Firefly, with integrations from Moonvalley and Topaz Labs coming soon.