
Open-source Meta Quest 3 app that applies real-time AI style transformation to passthrough camera footage at 150–200ms latency.

Category
AI Simulation
Decart XR is free to use. The client app is open-source under MIT license on GitHub and available as a precompiled sideload via SideQuest. The Decart AI service that processes the video transformation is governed by separate terms at decart.ai. Requires a Meta Quest 3 headset.
| Plan | Details |
|---|---|
| Free | Free to use. App available via SideQuest for Meta Quest 3 sideloading. Client source code on GitHub under MIT license. Decart AI service terms apply separately. |
Quick Summary
Decart XR is an open-source AI application for Meta Quest 3 that applies real-time video-to-video style transformation to the headset's passthrough camera feed, letting users see their physical surroundings reskinned in one of 61 premade visual styles — from Studio Ghibli and cyberpunk to fantasy and medieval environments — or restyled according to custom voice prompts. Developed by Decart AI and released in October 2025, the app processes live camera frames at 30fps and 720p resolution with approximately 150 to 200 milliseconds of end-to-end latency. The source code is publicly available on GitHub under the MIT license, and the precompiled app can be sideloaded on Meta Quest 3 via SideQuest.
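To put the quoted figures in perspective, a quick back-of-the-envelope calculation (a sketch based only on the 30fps and 150–200ms numbers above, not on Decart's implementation) shows how many frame intervals the stylized view trails behind the real world:

```python
# At 30 fps, each frame spans 1000 / 30 ≈ 33.3 ms. An end-to-end
# latency of 150-200 ms therefore corresponds to roughly 4.5 to 6
# frame intervals between the real world and the stylized view.

FPS = 30
FRAME_MS = 1000 / FPS  # ≈ 33.3 ms per frame


def frames_behind(latency_ms: float) -> float:
    """Number of 30fps frame intervals covered by a given latency."""
    return latency_ms / FRAME_MS


low = frames_behind(150)   # lower bound of the quoted latency range
high = frames_behind(200)  # upper bound
print(f"{low:.1f}-{high:.1f} frames behind real time")
```

At this lag, roughly five frames, head movement remains usable for ambient restyling, which is consistent with the app's positioning as an experience rather than a precision mixed-reality tool.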
Associated Tags
Meta Quest 3 AI, VR style transformation, real-time AI video, passthrough AI app, open-source VR AI, video to video AI, mixed reality AI
How professionals leverage Decart XR

Reviewed by Sohail Akhtar
Lead Editor & Founder