An open-source, budget-friendly Spatial Computing workspace for Android.
The ultimate goal of this project is to democratize spatial computing and multi-monitor productivity.
High-end AR/VR headsets like the Meta Quest 3 and Apple Vision Pro offer incredible productivity features, allowing users to surround themselves with multiple massive virtual displays. However, these devices cost hundreds or thousands of dollars.
AR Metaverse aims to bring that same futuristic, limitless workspace to the hardware people already own: a standard laptop and an Android smartphone. By combining AR tracking with low-latency WebRTC streaming and virtual display drivers, this project transforms a single-screen laptop into an expansive, multi-monitor AR workstation that floats in your physical room, all at no extra cost.
(The demo GIF may take a moment to load.) If the video does not render on your platform, open it directly here: ARWorkspace.mp4
- Virtual Multi-Monitor Setup: Bypasses hardware limitations to create multiple virtual desktop screens from a single laptop.
- Low-Latency Streaming: Uses WebRTC (via Unity Render Streaming) for real-time, peer-to-peer video transmission, so typing and mouse movement feel instantaneous.
- Spatial Anchoring: Built with AR Foundation to seamlessly track your physical environment and lock virtual screens to your real-world desk.
- Highly Accessible: Designed to run on standard Android phones, either handheld or slipped into a budget-friendly VR/AR phone enclosure (like Google Cardboard).
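The spatial anchoring feature above can be sketched with AR Foundation's raycast and anchor APIs. This is a minimal illustration rather than code from this repository; the class name `ScreenPlacer` and the `screenPrefab` field are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: place a virtual screen where the user taps,
// anchored to a detected real-world plane so it stays locked to the desk.
public class ScreenPlacer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] GameObject screenPrefab;   // the virtual monitor quad

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the tap position against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            ARPlane plane = (ARPlane)hits[0].trackable;

            // Attaching an anchor lets ARCore keep the pose stable as
            // tracking improves; the screen is parented to the anchor.
            ARAnchor anchor = anchorManager.AttachAnchor(plane, hitPose);
            if (anchor != null)
                Instantiate(screenPrefab, anchor.transform);
        }
    }
}
```

Attaching the anchor to the plane (rather than free-floating in world space) is what keeps the virtual screens from drifting as the device refines its map of the room.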
- Game Engine: Unity 3D
- AR Framework: AR Foundation / Google ARCore
- Networking: WebRTC / Unity Render Streaming / Web App
- `Assets/`: Unity project assets, including AR templates, scripts, scenes, and prefabs.
- `web app/`: The web application serving as the WebRTC signaling/streaming endpoint.
- `DEVLOG.md`: Tracks ongoing development progress.
- Install the required packages, including Unity Render Streaming (see the [official documentation](https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/index.html)).
- Open the Unity project (AR Metaverse) in Unity Hub with Android Build Support installed.
- Run the signaling server located in the `web app/` directory.
- Build the Android `.apk` and deploy it to your ARCore-compatible smartphone to start your spatial workspace!
- Open `screenshare.html` in `./web app/public1` and enter the IP address shown by the web app server.
- Ensure this IP matches the one in Project Settings > Render Streaming > URL.
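The IP-matching step is the one that most often goes wrong. As a small illustration (the `ws://` scheme and default port 80 are assumptions based on Unity Render Streaming's defaults, and the IP below is hypothetical), the value to paste into Project Settings > Render Streaming > URL can be formed from the IP the web app prints:

```shell
# Build the Render Streaming signaling URL from the IP printed by the
# web app server. Assumptions: default port 80 and the ws:// scheme.
SERVER_IP="192.168.1.10"   # replace with the IP shown by your server
SIGNALING_URL="ws://${SERVER_IP}:80"
echo "$SIGNALING_URL"
```

The phone, the browser page, and the Unity project must all point at this same address for the peer connection to be negotiated.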
- ARMetaverseScene (main scene)
- ARMetaverseSceneModified: AR Session tracking mode set to Rotation Only with plane detection disabled (highly optimized for lower CPU usage).
- SplitScreenScene1 - (formerly ARMetaverseScene1): Stereoscopic rendering (two cameras) on canvasDisplay.
- SplitScreenScene2 - (formerly ARMetaverseScene2): Stereoscopic rendering (two cameras) without canvasDisplay (modified for testing on Google Cardboard).
- TestStreaming - Scene for Unity Render Streaming tutorial.
- VRMetaverseScene: Duplicated from ARMetaverseSceneModified with the AR Camera Background removed, to test its effect on processing load.
- AndroidAppScene & WindowsAppScene: Setup for remote rendering; currently only browser streaming works.
- AndroidAppScene 1 & WindowsAppScene: Tests simultaneous active connections (Browser -> WindowsApp -> AndroidApp). [Working!]
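The CPU optimization used in ARMetaverseSceneModified (rotation-only tracking, no plane detection) can also be applied from a script. This is a hedged sketch assuming AR Foundation 4.x APIs; the class name `LightweightTracking` is hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: request rotation-only tracking and disable plane
// detection to reduce CPU load, as in ARMetaverseSceneModified.
public class LightweightTracking : MonoBehaviour
{
    [SerializeField] ARSession session;
    [SerializeField] ARPlaneManager planeManager;

    void Start()
    {
        // 3-DoF orientation tracking is much cheaper than full 6-DoF SLAM.
        session.requestedTrackingMode = TrackingMode.RotationOnly;

        // Plane detection is one of the most expensive ARCore features;
        // disabling the manager stops it entirely.
        if (planeManager != null)
            planeManager.enabled = false;
    }
}
```

The trade-off is that screens can no longer be anchored to detected surfaces in this mode; they follow head rotation only, which suits a seated, desk-facing workspace.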
