(Demo video: MRMUSK.mp4)
Besides a silly name, it's also a library for quickly prototyping shared Mixed Reality experiences on Quest devices. It supports colocation for up to 16 users, is compatible with Quest Building Blocks, and is especially well suited for hackathons and college courses.
Ready to get started? Check out our TODO.
YES! And if you fully intend to publish to AppLab or the Quest Store, you might want to start with Unity-LocalMultiplayerMR instead. However, keep in mind that it uses Shared Spatial Anchors, and SSAs are incredibly difficult to use in hackathons and college settings. For more information see The Problem with Shared Anchors.
Ideally we'd use a QR code or image target, but since third-party apps can't access the camera, we need something else that's trackable. Luckily, we have something trackable in the form of controllers. We just need a way to get those controllers into a precise location, and for that we use a simple controller stand.
At app launch, each user places their controller into the stand and presses a button combination to colocate (Button 1 + Button 2 + Trigger by default). After that, the controller can be removed from the stand and the experience proceeds normally.
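If you're curious what that trigger step might look like in code, here is a minimal sketch that watches for the default Button 1 + Button 2 + Trigger combination using the Meta XR SDK's OVRInput API. The component name and the event it raises are illustrative assumptions, not the library's actual API.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch (not the library's actual API): watches for the default
// Button 1 + Button 2 + Trigger combination on one controller via the Meta XR
// SDK's OVRInput and raises an event when the combination is first held.
public class ColocationComboWatcher : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;
    [SerializeField] private UnityEvent onColocateRequested;

    private bool comboWasHeld;

    private void Update()
    {
        bool comboHeld =
            OVRInput.Get(OVRInput.Button.One, controller) &&
            OVRInput.Get(OVRInput.Button.Two, controller) &&
            OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, controller);

        // Fire once, on the frame the full combination becomes held.
        if (comboHeld && !comboWasHeld)
        {
            onColocateRequested.Invoke();
        }
        comboWasHeld = comboHeld;
    }
}
```

In the Inspector, the event would be wired to whichever component performs the actual colocation step.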
Advanced Note: Since the camera transform itself is updated during colocation, Mixed Reality experiences can be prototyped quickly in Unity World Space rather than needing to think about dividing the word across anchors. This is quite helpful to developers who are just getting started with Mixed Reality, but learning how to design for spatial anchors is still highly valuable in a curriculum.
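To make the world-space idea concrete, here is a minimal sketch of how a camera rig could be realigned so the tracked controller lands on a known stand location in Unity World Space. It again assumes OVRInput for controller poses; the transform names and the exact math the library uses internally may differ.

```csharp
using UnityEngine;

// Minimal sketch of the realignment idea, assuming the Meta XR SDK's OVRInput
// for controller poses. standTarget marks where the stand sits in Unity world
// space; trackingSpace is the transform OVRInput poses are relative to
// (typically the camera rig's TrackingSpace). Both names are illustrative.
public class RigRealigner : MonoBehaviour
{
    [SerializeField] private Transform trackingSpace; // moved so the controller lands on the stand
    [SerializeField] private Transform standTarget;   // known stand pose in Unity world space
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;

    public void Realign()
    {
        // Controller pose in tracking (rig-local) space.
        Vector3 localPos = OVRInput.GetLocalControllerPosition(controller);
        Quaternion localRot = OVRInput.GetLocalControllerRotation(controller);

        // Solve for the tracking-space pose so that trackingSpace * local == standTarget:
        //   worldRot = rigRot * localRot           =>  rigRot = standRot * inverse(localRot)
        //   worldPos = rigRot * localPos + rigPos  =>  rigPos = standPos - rigRot * localPos
        Quaternion rigRot = standTarget.rotation * Quaternion.Inverse(localRot);
        Vector3 rigPos = standTarget.position - rigRot * localPos;

        trackingSpace.SetPositionAndRotation(rigPos, rigRot);
        // A production version would usually constrain the correction to yaw so the
        // headset's gravity alignment is preserved.
    }
}
```

Because the rig itself moves, gameplay code can place content at fixed world coordinates and every colocated user will see it in the same physical spot.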
We 3D printed this stand (Prusa version), but any stand will do. If you like this stand, we recommend scaling it down to 95% before printing. This makes the controller fit nice and snug, which helps improve colocation accuracy.
No problem! The Offset Settings in Colocation Manager can handle this.
Update Rotation Offset to match the angles of your stand, and optionally use Position Offset to specify a distance between the stand and the floor.
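As an illustration of how such offsets typically compose, the sketch below builds the expected docked-controller pose from a floor reference point plus a rotation and a position offset. The field names mirror the Offset Settings described above, but the example values and the way Colocation Manager applies them internally are assumptions.

```csharp
using UnityEngine;

// Sketch of how the offsets can compose, assuming the reference is a point on
// the floor directly under the stand. Field names mirror the Offset Settings
// above, but how Colocation Manager applies them internally may differ.
public class StandOffsetExample : MonoBehaviour
{
    [SerializeField] private Vector3 rotationOffsetEuler = new Vector3(30f, 0f, 0f); // stand tilt (example value)
    [SerializeField] private Vector3 positionOffset = new Vector3(0f, 0.8f, 0f);     // stand height above the floor (example value)

    // Expected pose of the docked controller, relative to the floor reference point.
    public Pose GetExpectedControllerPose(Pose floorReference)
    {
        Quaternion rotation = floorReference.rotation * Quaternion.Euler(rotationOffsetEuler);
        Vector3 position = floorReference.position + floorReference.rotation * positionOffset;
        return new Pose(position, rotation);
    }
}
```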
So what is the problem with Shared Spatial Anchors anyway, and why all this controller stuff?
It's actually less of a problem with Shared Spatial Anchors and more a problem with the setup required to use them. To use Shared Anchors, even in a prototype, Meta currently requires developers to:
- Have a verified developer account (which in turn requires a payment method or phone number).
- Create an Organization for their company.
- Understand and complete a Data Use Checkup explaining how user data will be used.
- Enable “User ID” and “User Profile” in the Data Use Checkup.
- Create Test User Accounts to use for testing before publishing.
- Add permissions to the Android manifest (though the SDK helps with this).
- Enable point cloud sharing on their device.
For hackathons and college settings, these requirements are often not feasible or simply take too much time. This library allows projects to start quickly without any of them.