ANUJMATION

MOTION – GEN AI SOLUTIONS

Leveraging the latest technology to deliver innovative design solutions that push the boundaries of creativity!

AI Directed Brand Film

When a script becomes a film, without any camera

The brief arrived as words on a page. The challenge was to turn that script into a fully realised brand film, with consistent characters, immersive environments, a composed soundscape and cinematic visual quality, using only AI tools and creative direction. No location scouts. No shoots. No post-production studio. Just a pipeline built from the ground up around intelligence.

The tools do not replace the director. They replace the camera crew. Creative judgement still drives every decision. The AI tools just execute at a speed and scale that was not possible before.

How the brand film was made

01

The script. The foundation of the brand film

Everything started with a script. Rather than treating it as a loose brief, we used it as a guide, breaking it down scene by scene to extract tone, pacing, character behaviour and visual intention. The script became the source of truth that every tool and decision downstream had to serve.
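As an illustration only (not the studio's actual tooling), a scene-by-scene breakdown like the one described can be sketched as a simple data structure. The field names and the sample scene below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Scene:
    # Structured fields extracted from the script breakdown
    number: int
    tone: str
    pacing: str
    characters: list
    visual_intention: str

def to_storyboard_prompt(scene):
    # Compose an image-generation prompt from the structured fields,
    # so every downstream tool serves the same source of truth.
    cast = ", ".join(scene.characters)
    return (f"Scene {scene.number}: {scene.visual_intention}; "
            f"featuring {cast}; mood: {scene.tone}; pacing: {scene.pacing}")

opening = Scene(1, "hopeful", "slow build",
                ["founder"], "dawn light over an empty studio")
print(to_storyboard_prompt(opening))
```

Keeping the breakdown structured this way means the same scene record can drive storyboard, character and video prompts without drifting from the script.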

02

Storyboard generation for the brand film

The script was translated into a visual storyboard using AI image generation. Each panel captured camera angles, composition and mood before any video was produced. This stage functioned like a pre-production review: we locked the story visually before committing to production. It also gave the client an early preview to sign off on.

03

Creating characters for the brand film. Using Leonardo AI and Midjourney

Consistent, on-brand characters were developed using Leonardo AI and Midjourney. We ran iterations to establish each character's visual identity: facial structure, costume, lighting response. We then locked them into reference sheets that guided all subsequent generation. Character consistency across scenes, one of the hardest problems in generative video, was solved through careful planning and image-to-image workflows.
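The reference-sheet idea can be sketched in a few lines. This is a minimal illustration, not the actual sheets or prompts used on the project; the character and attributes are invented:

```python
# Hypothetical character "reference sheet": locked visual attributes that are
# folded into every downstream prompt so the character stays consistent.
REFERENCE_SHEET = {
    "name": "the founder",
    "facial_structure": "angular jaw, deep-set eyes",
    "costume": "charcoal suit, no tie",
    "lighting_response": "soft key, warm rim light",
}

def character_prompt(sheet, action):
    # Prepend the locked attributes to each scene-specific action.
    locked = ", ".join(v for k, v in sheet.items() if k != "name")
    return f"{sheet['name']} ({locked}) {action}"

print(character_prompt(REFERENCE_SHEET, "walks through a rain-lit street"))
```

In practice the locked attributes travel with reference images through image-to-image generation; the dictionary above just shows the planning discipline behind it.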

04

Building environments and scenes for the brand film. Using ComfyUI and Midjourney

Backgrounds and environments were built as full compositional scenes, not simple backdrops. ComfyUI's node-based pipeline allowed fine control over lighting, atmosphere and depth, while Midjourney generated wide establishing shots with cinematic scale. We designed the environments to feel like real locations, not just generated textures.

05

Generating video for the brand film. Using Sora and Veo 3.1

Storyboard panels and character references were fed into Sora and Veo 3.1 to generate the motion sequences. Each clip was directed with precise instructions controlling camera movement, subject action and scene duration. We generated multiple takes per scene and selected the best performances, just like in a traditional edit suite. Veo 3.1's audio-aware generation also helped us time the scenes against the score.
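The multi-take selection step can be sketched as a simple scoring pass. The take list, scoring criteria and weights below are all hypothetical, purely to illustrate the idea of keeping the best performance per scene:

```python
# Hypothetical edit-suite pass: generate several takes per scene,
# score each against the direction notes, and keep the best one.
def best_take(takes, score_fn):
    return max(takes, key=score_fn)

takes = [
    {"id": "take_01", "camera_steadiness": 0.7, "on_action": 0.9},
    {"id": "take_02", "camera_steadiness": 0.9, "on_action": 0.95},
    {"id": "take_03", "camera_steadiness": 0.8, "on_action": 0.6},
]

# Invented scoring rule: reward takes that are both steady and on-action.
chosen = best_take(takes, lambda t: t["camera_steadiness"] * t["on_action"])
print(chosen["id"])  # take_02
```

On the real project the "scores" were human creative judgement, as the section above notes; the sketch just shows the select-the-best-take structure of the workflow.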

06

Creating the background score for the brand film. Using AI music composition

The brand film's emotional arc was underscored with a background score generated using AI music tools, timed and tuned to the edit. Rather than using stock music, the score was composed specifically for this narrative, shifting in texture and intensity to match the film's beats. The result is a piece of music that feels like it was written for the brand film, not just assembled from parts.

Tools and Pipeline

Sora
Veo 3.1
ComfyUI
Midjourney
Leonardo AI
AI Music Gen
Prompt Engineering
Img2Img Workflow

Pipeline Phases

Script

Storyboard

Characters

Environments

Video Gen

Score

Phase 1

Phase 2

Phase 3

Phase 4

Phase 5

Phase 6

AAA Game Intro: Cinematic Intro Sequence

A Bangalore AAA gaming studio wanted an intro that would match the high-quality visuals players expect from a top-notch game. The goal was clear: show the world, introduce the characters and create tension, all in the first few seconds. Anujmation handled the project from start to finish, managing every step in-house.

“Every great game intro starts with a story, not software. We built this frame by frame.”

01

Storyboard

The sequence began as a digital hand-drawn storyboard. Every camera angle, character movement and environmental change was planned out before modelling began. This step set the pace, story flow and emotional tone, giving the team a shared guide.

02

3D modelling in parallel

While the storyboard was being refined, 3D modelling started in parallel. Characters, props and environments were built at the same time, a process that sped up the timeline without losing detail. Asset libraries were organised from the start for smooth integration into Maya and Unreal Engine.

03

Animation in Maya

Character animation was created in Autodesk Maya, where performance could be fine-tuned: weight, timing and secondary motion were adjusted until each scene felt realistic. Rigs were built for in-engine performance, allowing clean export into the real-time pipeline.

04

In-engine rendering with Unreal Engine

The animated scenes were brought into Unreal Engine for real-time rendering. Unreal's Lumen global illumination and Nanite geometry systems gave the sequence a cinematic look that traditional pipelines struggle to match. Lighting, shaders and environmental effects were refined directly in-engine.

05

Live video capture and compositing

Live footage was added to the composite during the final stage, making the digital world feel more real and grounded. The mix of live action and 3D was handled frame by frame, with colour grading and motion tracking used to make the joins seamless. The result is a sequence that feels both epic and genuine.

Tools and Pipeline

Autodesk Maya
Unreal Engine 5
Substance Painter
Adobe Suite
Compositing
Motion Tracking
Colour Grading

Pipeline Phases

Storyboard

3D Modelling

Animation

Engine Render

Compositing

Phase 1

Phase 2

Phase 3

Phase 4

Phase 5

AI Timelapse Architecture Villa

Casual Games: Promotional Videos

Nachos Marketing Ad

Thumbs up Marketing Ad
