Pricing: Free
Verified: Yes

Open-source Meta Quest 3 app that applies real-time AI style transformation to passthrough camera footage at 150–200ms latency.

Category

AI Simulation


Pricing

Decart XR is free to use. The client app is open-source under MIT license on GitHub and available as a precompiled sideload via SideQuest. The Decart AI service that processes the video transformation is governed by separate terms at decart.ai. Requires a Meta Quest 3 headset.

Plan: Free
Details: Free to use. App available via SideQuest for Meta Quest 3 sideloading. Client source code on GitHub under MIT license. Decart AI service terms apply separately.

What is Decart XR?

Quick Summary

Decart XR is an open-source AI application for Meta Quest 3 that applies real-time video-to-video style transformation to the headset's passthrough camera feed, letting users see their physical surroundings reskinned in one of 61 premade visual styles — from Studio Ghibli and cyberpunk to fantasy and medieval environments — or described through custom voice prompts. Developed by Decart AI and released in October 2025, the app processes live camera frames at 30fps and 720p resolution with approximately 150 to 200 milliseconds of end-to-end latency. The source code is publicly available on GitHub under MIT license, and the precompiled app can be sideloaded on Meta Quest 3 via SideQuest.

Decart XR is a Unity-based Meta Quest 3 application that captures live video from the headset's passthrough cameras, streams it over WebRTC to Decart AI's server-side video-to-video neural network, applies an AI style transformation, and superimposes the processed output back into the user's view in near real time. The application's primary mode, Warp the World, uses Decart's Mirage model and offers 61 premade world transformation prompts plus unlimited custom voice-described styles. A secondary mode, Swap Your Friend, uses the Lucy model to apply 15 person-specific transformations to people in the camera view. Processing runs at 30fps through a VP8-encoded WebRTC video pipeline and delivers end-to-end latency of approximately 150 to 200 milliseconds. The current implementation transforms a rectangular 2D viewport within the headset's view rather than a full stereoscopic 3D environment, a structural limitation of the current architecture.

Decart XR is primarily of interest to VR developers, XR researchers, and technology enthusiasts who want to explore real-time AI video-to-video transformation on consumer VR hardware. Developers use it as a technical reference for building Quest 3 applications that combine passthrough camera access, WebRTC streaming, and AI service integration; Decart's team published a detailed technical breakdown of the architecture in their developer cookbook. Researchers investigating the feasibility of real-time AI scene transformation in mixed reality use it as a working implementation to study latency, consistency, and practical constraints. The fully open-source client codebase, available on GitHub under MIT license, makes it a concrete starting point for developers building on similar VR AI interaction concepts.

Decart XR is free to use and is available on the web at vr.decart.ai and as a sideloadable Quest 3 app via SideQuest. The client code is open-source under MIT license, though the Decart AI service it connects to is governed by separate terms of service. The app requires a Meta Quest 3 (it does not support Quest 2 or other headsets), and sideloading requires enabling developer mode on the device. Independent testing has noted that in practice latency can be perceptibly higher than the specified range, and the 2D viewport framing means the style transformation does not fill the full field of view, making the current version a compelling technical demonstration rather than a polished consumer product.
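The quoted 150 to 200ms figure can be thought of as a per-stage budget across capture, encoding, network transit, and inference. The sketch below is a rough illustration only: the individual per-stage costs are assumptions, not measured or published Decart values; only the total is chosen to land inside the quoted range.

```python
# Illustrative latency budget for a 30fps, 720p passthrough-to-AI
# round trip like the one Decart XR describes. Every per-stage cost
# below is an assumption for illustration; only the total is meant
# to match the quoted 150-200ms end-to-end figure.
import math

FPS = 30
FRAME_INTERVAL_MS = 1000 / FPS  # ~33.3ms between captured frames

stage_ms = {
    "capture_and_vp8_encode": 25,   # headset-side camera read + encode (assumed)
    "uplink_to_server": 35,         # WebRTC network transit (assumed)
    "server_side_inference": 70,    # video-to-video model time (assumed)
    "downlink_to_headset": 35,      # return transit (assumed)
    "decode_and_composite": 15,     # decode + draw into the 2D viewport (assumed)
}

total_ms = sum(stage_ms.values())  # 180ms, inside the quoted range

# With ~180ms in flight and a new frame arriving every ~33ms, roughly
# six frames are somewhere in the pipeline at any given moment.
frames_in_flight = math.ceil(total_ms / FRAME_INTERVAL_MS)

print(f"total: {total_ms}ms, frames in flight: {frames_in_flight}")
```

The frames-in-flight figure illustrates why pipeline consistency matters: the headset is always displaying a transformation of a scene several frames in the past, which is one source of the perceptible lag noted by independent testers.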

Associated Tags

Meta Quest 3 AI, VR style transformation, real-time AI video, passthrough AI app, open-source VR AI, video to video AI, mixed reality AI

Key Features

Real-time AI style transformation via passthrough
61 premade world transformation styles
Custom style input via voice prompt
Lucy model for person-specific transformations
30fps WebRTC video pipeline at 720p
150 to 200ms end-to-end processing latency
Open-source MIT client on GitHub
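The 720p/30fps VP8 pipeline in the feature list implies heavy compression. A quick back-of-the-envelope calculation shows why raw passthrough frames cannot be streamed to the server; note that the ~2.5 Mbps encoded bitrate below is an assumed typical VP8 target for this resolution, not a figure published for Decart XR.

```python
# Rough bandwidth arithmetic for a 720p, 30fps VP8 pipeline.
# The encoded bitrate is an assumed typical VP8 target at this
# resolution, not a figure published for Decart XR.

WIDTH, HEIGHT, FPS = 1280, 720, 30
BYTES_PER_PIXEL_I420 = 1.5          # 4:2:0 chroma subsampling

raw_frame_bytes = int(WIDTH * HEIGHT * BYTES_PER_PIXEL_I420)
raw_bits_per_sec = raw_frame_bytes * FPS * 8   # uncompressed stream rate

ASSUMED_VP8_BITRATE = 2_500_000     # 2.5 Mbps (assumption)
compression_ratio = raw_bits_per_sec / ASSUMED_VP8_BITRATE

print(f"raw: {raw_bits_per_sec / 1e6:.1f} Mbps, "
      f"ratio: {compression_ratio:.0f}x")
```

Under these assumptions the raw stream is roughly 330 Mbps, about two orders of magnitude more than the encoded stream, which is why the client encodes with VP8 over WebRTC rather than shipping raw frames.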

Real Use Cases

How professionals leverage Decart XR
  • Experiencing a familiar room reskinned as a cyberpunk environment, anime world, or medieval scene by pointing a Quest 3 at the physical space and selecting a preset style
  • Providing a custom voice prompt describing a specific visual environment and watching the passthrough view transform toward that description in real time
  • Using the application as a technical reference for building Meta Quest 3 apps that combine passthrough camera access, WebRTC streaming, and AI video processing
  • Demonstrating real-time AI video-to-video transformation capabilities in a VR context at developer meetups, research showcases, or technology events
  • Forking the open-source client on GitHub to experiment with modified WebRTC parameters, custom style models, or extended camera access implementations
  • Testing the practical latency and consistency of AI style transformation on consumer VR hardware as part of an XR research or feasibility study

Editor's Verdict

Official Review
Decart XR is a technically credible first step toward real-time AI environmental style transformation on consumer VR hardware, offering a working passthrough-to-AI video pipeline on Meta Quest 3 with an open-source client that makes it genuinely useful for developers and researchers. The current 2D viewport framing and perceptible latency make it a developer preview and research tool rather than a finished immersive experience.

Reviewed by Sohail Akhtar

Lead Editor & Founder

Pros

What we like

  • A working real-time AI video transformation pipeline on a consumer VR headset is technically significant: Decart XR achieves a genuinely novel interaction concept with sub-200ms processing latency on consumer Quest 3 hardware, backed by a server-side AI service
  • Full MIT-licensed open-source client code on GitHub with detailed technical documentation makes it a practical starting point for developers building on similar Quest 3 camera-plus-AI-service architectures
  • Free access via SideQuest with no account or subscription required for the app itself allows any Quest 3 developer or enthusiast to evaluate the technology firsthand without cost

Cons

Limitations

  • The style transformation applies to a 2D rectangular viewport within the headset's view rather than a full stereoscopic 3D environment, which means the current experience functions more like a transformed video window than an immersive reskinned world
  • Independent testing has found practical latency can exceed the specified 150 to 200ms range, and the 720p resolution at 30fps limits visual fidelity compared to the native passthrough quality — making it a compelling technical demonstration rather than a polished consumer application in its current state

Target Audience

Who should use Decart XR?

  • VR developers and Unity developers exploring Quest 3 passthrough camera access, WebRTC streaming, and AI service integration patterns
  • XR and mixed reality researchers studying real-time AI video transformation feasibility on consumer headsets
  • Technology enthusiasts with a Meta Quest 3 who want to experience AI-generated visual style transformations applied to their physical environment
  • Developers looking for an open-source starting point for Quest 3 applications involving live camera AI processing
  • AI and creative technologists building or researching real-time video-to-video neural network applications for interactive media
Frequently Asked Questions

What is Decart XR?
Decart XR is a free, open-source Meta Quest 3 application that uses Decart AI's video-to-video neural network to apply real-time style transformations to the headset's passthrough camera feed, reskinning the user's physical surroundings in AI-generated visual styles.
What headsets does Decart XR support?
Decart XR is specifically designed for Meta Quest 3. It requires the Quest 3's passthrough camera capabilities and does not support Quest 2 or other VR headsets.
Is Decart XR free to use?
Yes. The client app is free and available via SideQuest for sideloading on Meta Quest 3. The source code is open-source on GitHub under MIT license. The Decart AI processing service is governed by separate terms at decart.ai.
How does Decart XR work?
Decart XR streams live passthrough camera footage over WebRTC to Decart's server-side AI model, which applies a style transformation and returns the processed video for display in the headset — completing the cycle in approximately 150 to 200 milliseconds at 30fps and 720p.
How many styles does Decart XR include?
The Warp the World mode includes 61 premade visual transformation styles and supports unlimited custom styles via voice prompt. The Swap Your Friend mode offers 15 person-specific transformations via the Lucy model.
Who should use Decart XR?
Decart XR is best suited for VR developers and XR researchers who want to explore real-time AI video transformation on Quest 3, and for technology enthusiasts with a Quest 3 who want to experience AI-driven style transformation applied to their real-world environment.