Ahmev-Ayush/AR-Metaverse

🚀 AR Metaverse

An open-source, budget-friendly Spatial Computing workspace for Android.

🎯 The Ultimate Aim

The ultimate goal of this project is to democratize spatial computing and multi-monitor productivity.

High-end AR/VR headsets like the Meta Quest 3 and Apple Vision Pro offer incredible productivity features, allowing users to surround themselves with multiple massive virtual displays. However, these devices cost hundreds or thousands of dollars.

AR Metaverse aims to bring that same futuristic, limitless workspace to hardware people already own: a standard laptop and an Android smartphone. By combining AR tracking with low-latency WebRTC streaming and virtual display drivers, this project transforms a single-screen laptop into an expansive, multi-monitor AR workstation that floats in your physical room—all at zero cost.

🎬 Demo

AR Metaverse Workspace Demo

(The GIF may take a moment to load.) If the video does not render on your platform, open it directly here: ARWorkspace.mp4

✨ Core Features

  • Virtual Multi-Monitor Setup: Bypasses hardware limitations to create multiple virtual desktop screens from a single laptop.
  • Low-Latency Streaming: Uses WebRTC (via Unity Render Streaming) for real-time, peer-to-peer video transmission, so typing and mouse movements feel nearly instantaneous.
  • Spatial Anchoring: Built with AR Foundation to seamlessly track your physical environment and lock virtual screens to your real-world desk.
  • Highly Accessible: Designed to run on standard Android phones, either handheld or slipped into a budget-friendly VR/AR phone enclosure (like Google Cardboard).
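On the receiving (browser or phone) side, the core of such a WebRTC stream is simply attaching the incoming video track to a video element. A minimal sketch of that wiring, assuming standard browser objects; `attachRemoteVideo` is a hypothetical helper for illustration, not code from this repository (Unity Render Streaming's bundled web client does the equivalent for you):

```javascript
// Illustrative sketch: attach the remote desktop-capture track from an
// RTCPeerConnection-like object to a <video> element.
// "pc" and "videoEl" are assumed to be browser objects; the helper name
// is hypothetical and not part of this repository.
function attachRemoteVideo(pc, videoEl) {
  pc.ontrack = (event) => {
    // The first stream carries the screen capture sent by the laptop.
    if (videoEl.srcObject !== event.streams[0]) {
      videoEl.srcObject = event.streams[0];
    }
  };
}
```

In a real page this runs after the signaling exchange completes, at which point the peer connection starts firing `track` events for the laptop's screen capture.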

🛠️ Tech Stack

  • Game Engine: Unity 3D
  • AR Framework: AR Foundation / Google ARCore
  • Networking: WebRTC / Unity Render Streaming / Web App

📂 Project Structure

  • Assets/: Unity project assets including AR templates, scripts, scenes, and prefabs.
  • web app/: The web application serving as the WebRTC signaling/streaming endpoint.
  • DEVLOG.md: Tracking ongoing development progress.

🚀 Getting Started

  1. Install the required Unity packages, including Unity Render Streaming (see the official documentation: https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/index.html).
  2. Open the Unity project (AR Metaverse) in Unity Hub with Android Build Support.
  3. Run the signaling server located in the web app/ directory.
  4. Build the Android .apk and deploy it to your ARCore-compatible smartphone to start your spatial workspace!
  5. Open screenshare.html in ./web app/public1 and enter the IP shown by the web app server.
  6. Ensure this IP matches the URL set under Project Settings > Render Streaming > URL.
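Steps 5 and 6 boil down to one constraint: the browser page and the Unity project must point at the same signaling address. A tiny sketch of that, assuming HTTP signaling; `signalingUrl` is a hypothetical helper, and the default port of 80 is an assumption (use whatever port the web app server actually prints):

```javascript
// Hypothetical helper: build the signaling URL that both screenshare.html
// and Project Settings > Render Streaming > URL must agree on.
// The default port 80 is an assumption; use the port the web app prints.
function signalingUrl(ip, port = 80, secure = false) {
  const scheme = secure ? "https" : "http";
  return `${scheme}://${ip}:${port}`;
}
```

For example, `signalingUrl("192.168.1.10")` yields `http://192.168.1.10:80`; the same string goes in both places.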

Scenes in the project

  1. ARMetaverseScene (main scene)
  2. ARMetaverseSceneModified - AR session tracking mode set to Rotation Only with plane detection disabled (optimized for lower CPU usage).
  3. SplitScreenScene1 - (formerly ARMetaverseScene1): Stereoscopic rendering (two cameras) on canvasDisplay.
  4. SplitScreenScene2 - (formerly ARMetaverseScene2): Stereoscopic rendering (two cameras) without canvasDisplay (modified for testing on Google Cardboard).
  5. TestStreaming - Scene for Unity Render Streaming tutorial.
  6. VRMetaverseScene - Duplicated from ARMetaverseSceneModified to remove the AR Camera Background and test its effect on processing.
  7. AndroidAppScene & WindowsAppScene - Setup for remote rendering; currently only browser streaming works.
  8. AndroidAppScene 1 & WindowsAppScene - Testing simultaneous active connections (Browser -> WindowsApp -> AndroidApp). [Working!]
