Honda National Dealer Meeting 2025
AI Pipeline for the Honda Base Station Walk-Around
CONNECT
Open for work. email: raycollide.visuals@gmail.com
For the 2025 Honda National Dealer Meeting, I created the outdoor camping landscapes for the walk-around reveal of the new Honda Base Station camper — a cinematic piece built for the venue’s full LED stage at 7040×1920 across a center screen and two offset downstage panels.
The pipeline bridged generative AI imagery, animation, upscaling, and traditional motion design. For animation, I chose Google Flow over Midjourney — cleaner motion, less morphing.
The venue ran three LED screens as a single canvas:
· Screen #1 (SR) — 1280 × 1920, 15'8" W × 23'7" H
· Screen #2 (Center) — 4480 × 1920, 55' W × 23'7" H
· Screen #3 (SL) — 1280 × 1920, 15'8" W × 23'7" H
Total canvas: 7040 × 1920 at 30fps.
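The single-canvas spec is easy to sanity-check. A minimal sketch (the inch conversions, 15'8" = 188", 55' = 660", are my own, not from the show doc):

```python
# Sanity check on the stage canvas spec above.
# Pixel widths from the show spec; physical widths converted to inches by me.
screens = {
    "SR":     {"px_w": 1280, "in_w": 188},
    "Center": {"px_w": 4480, "in_w": 660},
    "SL":     {"px_w": 1280, "in_w": 188},
}

total_width_px = sum(s["px_w"] for s in screens.values())
print(total_width_px)  # 7040, the full canvas width

# Pixel pitch (inches per pixel) should agree across all three screens
for name, s in screens.items():
    print(name, round(s["in_w"] / s["px_w"], 3))  # ~0.147 in (~3.7 mm)
```

The matching pitch across all three screens is what let the venue treat them as one continuous 7040-pixel canvas.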
01 — BASE IMAGE (NANO BANANA)
Nano Banana generated the base landscape frames. Strong composition, clean lighting, the right mood for a basecamp environment.
The catch: Nano Banana exports at 1280×720. Useful as a seed, but nowhere near enough resolution for a wide camera shot that would eventually need to live on a 7040-pixel canvas. Trying to generate a wide shot natively in Nano Banana would have meant up-resing a low-detail image — a losing battle downstream.
So Nano Banana's job was strict: deliver the strongest possible 1280×720 base frame.
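The resolution math makes the losing battle concrete. A quick comparison (my arithmetic, not from the show files) of the two routes to a 7040-pixel canvas:

```python
# Route A: generate wide natively at 1280px and stretch straight to the stage
naive_upres = 7040 / 1280
print(naive_upres)  # 5.5x interpolation of low-detail pixels

# Route B: Firefly fills the frame to 3840px with newly generated detail
# first, so the later upscale only has to cover the remaining gap
pipeline_upres = 7040 / 3840
print(round(pipeline_upres, 2))  # 1.83x
```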
02 — WIDE-FORMAT EXTENSION (PHOTOSHOP + FIREFLY)
Photoshop solved what Nano Banana couldn't. Each 1280×720 base was placed strategically into a 3840×2160 canvas. Firefly's Generative Expand was prompted to extend the landscape into the surrounding empty space.
The result: high-detail, wide-format frames built from a small seed without sacrificing fidelity. Beautiful wide landscapes that held up to scrutiny — and to upscaling — far better than anything generated wide from scratch.
This stage was the unlock for the entire pipeline.
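A quick measure of how much work Generative Expand was doing here, assuming the 1280×720 seed sat whole inside the 4K canvas:

```python
seed_w, seed_h = 1280, 720
canvas_w, canvas_h = 3840, 2160

# Fraction of the expanded frame that came from the original seed
coverage = (seed_w * seed_h) / (canvas_w * canvas_h)
print(round(coverage, 3))  # 0.111: Firefly invents roughly 8/9 of every frame
```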
03 — ANIMATION (GOOGLE FLOW)
Google Flow handled the motion. Each prompt was directed for a static camera with naturalistic landscape movement and dramatic timelapse-speed cloud motion.
The defining constraint here was time: Flow caps generations at 8 seconds — not enough for a reveal sequence.
The workaround: prompt for timelapse-speed motion, generate compressed, time-rich clips inside the 8-second limit, then expand them in post.
A locked camera on a moving sky and shifting landscape gave the sequence a cinematic, almost meditative pace once the timing was redistributed downstream.
04 — UPRES & RETIME (TOPAZ VIDEO AI)
Topaz did three jobs in one pass.
First, resolution. The Flow output was upscaled to 7040 × 3960 — preserving 16:9 throughout the workflow rather than committing to the final 7040×1920 letterbox at this stage. That extra vertical headroom became important downstream.
Second, retime. The timelapse-speed Flow clips were slowed 3 to 4× to land at natural playback speed without losing the time-richness that the 8-second cap forced into the original generation.
Third, framerate. The footage was conformed from 24fps to the show's 30fps deliverable.
One stage, three problems solved.
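The duration math behind that pass is worth spelling out. Illustrative arithmetic only, not Topaz's internals:

```python
clip_seconds = 8   # Flow's hard generation cap
show_fps = 30      # the deliverable framerate

# Slowing the timelapse-speed clips 3x to 4x stretches each 8-second
# generation into usable reveal-sequence lengths:
for factor in (3, 4):
    out_seconds = clip_seconds * factor
    out_frames = out_seconds * show_fps
    print(f"{factor}x slowdown -> {out_seconds}s, {out_frames} frames at 30fps")
# 3x -> 24s / 720 frames; 4x -> 32s / 960 frames per source clip
```

Each 8-second generation becomes 24 to 32 seconds of screen time, which is why the timelapse-speed prompting mattered: the compressed motion had to survive being stretched that far.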
05 — EDIT, COLOR & DELIVERY (AFTER EFFECTS)
The 7K plates came into After Effects for color correction, integration, and final composition for the reveal sequence.
Two masters were delivered to the show's lead editor:
· 7040 × 3960 — a 16:9 master with full vertical headroom, allowing the editor to reposition the frame up or down within the final 7040×1920 stage canvas as needed.
· 7040 × 1920 — a pre-cropped master matched exactly to the stage canvas, ready to drop into the timeline with zero adjustment.
Two files, two use cases. The editor stayed flexible without redoing any work upstream.
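The headroom in the 16:9 master is easy to quantify. A sketch (the function name and interface are mine, purely illustrative) of the vertical travel the editor had when placing the stage window:

```python
MASTER_H, STAGE_H = 3960, 1920  # master height vs. stage canvas height

def crop_window(y_offset: int) -> tuple[int, int]:
    """Return (top, bottom) rows of the 1920px stage window in the master."""
    max_offset = MASTER_H - STAGE_H  # 2040px of vertical travel
    if not 0 <= y_offset <= max_offset:
        raise ValueError(f"offset must be within 0..{max_offset}")
    return (y_offset, y_offset + STAGE_H)

print(crop_window(1020))  # centered framing: (1020, 2940)
```

Over two thousand pixels of reposition room, with no re-render required upstream.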
06 — FINAL SEQUENCE FEATURING MY AI ANIMATED LANDSCAPE PLATES
The final sequence is shown as it appeared at the event, pulled from documentary footage of the live show.
Roughly 30 plates were generated, 14 made the final sequence, and 6 are featured here.
My landscape plates were handed off to a master editor, who composited them in Premiere Pro alongside 3D scenes, video footage, images, and graphical elements from other contributors.
REFLECTION
Every stage of this pipeline involved pushing a tool past its native output: Nano Banana exporting too small, Firefly extending where Nano Banana couldn't go wide, Flow capped at 8 seconds, Topaz solving resolution, time, and framerate in one pass, After Effects unifying everything for a 7040-pixel stage.
The lesson the pipeline kept reinforcing: AI tools don't yet deliver final-format output for architectural-scale work, but they don't have to. Knowing where each tool's ceiling is — and where the next tool's strength begins — is the actual craft.
A small generated image, extended thoughtfully, animated within constraint, upscaled with intent, and finished traditionally, can land at full architectural scale without ever feeling generated.
TOOLS USED
Gemini · Nano Banana · Adobe Photoshop · Adobe Firefly · Google Flow · Topaz Video AI · Adobe After Effects
CREDITS
Concept, Prompt Engineering & AI Art Direction, Motion Design, and Post Compositing — Ray Oasay
Production Company: Martin Brinkerhoff Associates (MBA Productions)
Client: Honda
Event: Honda National Dealer Meeting
Year: 2025