LOS ANGELES, January 24, 2026 /PRNewswire/ — At CES 2026, VR filmmaker and educator Hugh Hou led a live spatial computing demonstration in the GIGABYTE suite, showing how immersive video is created in real production environments rather than in theory or under controlled laboratory conditions.
The session gave attendees an in-depth look at a complete space filmmaking pipeline, from capture to post-production and final playback. Instead of relying on pre-rendered content, the workflow was executed live on the show floor, mirroring the same processes used in commercial XR projects and imposing clear requirements for system stability, performance consistency, and thermal reliability. The experience culminated with attendees watching a two-minute space movie trailer on Meta Quest, Apple Vision Pro and the recently launched Galaxy XR headsets, as well as a 3D tablet screen offering an additional 180-degree viewing option.
Where AI Fits Into Real Creative Workflows
AI was presented not as a flagship feature, but as a practical tool integrated into everyday editing tasks. During the demo, AI-assisted refinement, tracking, and preview processes helped accelerate iterations without interrupting the creative flow.
Footage captured on immersive, cinema-quality cameras was brought into industry-standard software, including Adobe Premiere Pro and DaVinci Resolve. AI-driven upscaling, noise reduction, and detail refinement were applied to meet the visual demands of immersive VR, where any artifacts or softness become immediately noticeable in a 360-degree viewing environment.
Why Platform Design Matters for Spatial Computing
The entire workflow was supported by a custom-built GIGABYTE AI PC, specifically designed for sustained spatial video workloads. The system combined an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, providing the memory bandwidth and continuous AI performance required for real-time 8K spatial video playback and rendering. Just as essential, the X870E AORUS MASTER X3D ICE motherboard provided stable power and signal integrity, allowing the workflow to run predictably throughout the live demo.
By enabling a demanding space filmmaking workflow to run live and repeatedly at CES, GIGABYTE demonstrated how platform-level systems design transforms a complex immersive production into something creators can build on, not just experiment with.
SOURCE GIGABYTE



