This post was co-authored by Craig Herndon, an XR Terra instructor.
Unity’s Mixed and Augmented Reality Studio (MARS), one of this year’s most anticipated releases in the XR development ecosystem, finally launched earlier this month. For many developers, the initial sticker price of $50/month or $600/year has shifted the discussion from a much-needed feature set to skepticism: the question is no longer “what is MARS” but “is MARS worth it?” Here at XR Terra, we spent some time with MARS and would like to give creators and businesses information that will help them make that decision.
To help with that decision, we will be covering the following features in Unity’s MARS:
- Simulation View
- Proxy-based Workflow
- Fuzzy Authoring
- MARS Session
The biggest feature MARS provides is the Simulation View, which ships with environment templates that let you test an AR application in a wide variety of settings such as a home, factory, office, or outdoors. You can even scan or model your own simulation environment. Before we get into how this works, let us look at the development workflow prior to MARS.
Without MARS, once the app is ready for testing in Unity, the developer needs to deploy it to a device, which, depending on your PC specs and whether the target device is Android or iOS, can take anywhere from 1 to 5 minutes. This alone is a large loss of time for any developer.
Once the app is deployed to the device, the developer then needs to pick up their phone and scan for the appropriate surface they are testing against. Chances are that the desk or table in the office is not the same as what the end user will be using; if the expected use case is a factory, for example, the developer will not be able to test reliably at all. This step can take another 2 to 5 minutes depending on what is being tested.
Only after completing this workflow can the developer see the effect of the changes they made – for example, whether a tweak to their code worked, or whether the object appears correctly – which takes at least another minute.
In the best case, the developer can make 15 changes in an hour. In the worst case, they can only make 4 changes in that hour. On average, most developers fall in the 7 to 10 changes per hour range.
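The arithmetic behind these estimates can be sketched out directly. The step times below are the rough figures from the paragraphs above, not measurements; for the worst case we assume verification also drags out to a few minutes:

```python
# Rough iteration-rate arithmetic for the pre-MARS edit-deploy-test loop.
# Step times (in minutes) are the estimates quoted in the text, not measured data.

def changes_per_hour(deploy_min, scan_min, verify_min):
    """How many full edit-deploy-scan-verify cycles fit in one hour."""
    cycle = deploy_min + scan_min + verify_min
    return 60 // cycle

# Best case: fast deploy (1 min), quick scan (2 min), quick check (1 min).
best = changes_per_hour(deploy_min=1, scan_min=2, verify_min=1)

# Worst case: slow deploy and scan (5 min each), slow verification (5 min).
worst = changes_per_hour(deploy_min=5, scan_min=5, verify_min=5)

print(best, worst)  # 15 and 4 cycles per hour
```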
Keep in mind that the above times are for the happy path where everything is going smoothly, and you aren’t trying to track down a weird bug with the Android Debug Bridge (adb) or the low-level debugger on iOS.
With MARS, developers can cut out the deploy and scan steps entirely by testing in the simulation environments. In the MARS sample, an energetic robot collects blue crystals: the robot spawns at the first area scanned by the phone, and the crystals spawn based on the type of surface the developer wants them to appear on.
Proxy-based Workflow & Fuzzy Authoring
MARS provides developers with a Proxy script that lets users set the criteria for where they want their objects to spawn – in this example, a flat horizontal surface of 2 ft by 3 ft, or a vertical surface such as a wall (see the proxy example below). Instead of using precise or exact measurements, developers can set minimum and maximum conditions, which Unity refers to as “Fuzzy Authoring”. Once the proxy is set up, a user can attach content that appears once the proxy is located in the environment. MARS also allows the creation of proxy groups that require multiple surfaces to be present for the app to work.
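In MARS itself these conditions are C# components attached to a Proxy, but the core idea of fuzzy authoring is easy to illustrate in isolation. The sketch below is our own conceptual model, not the MARS API – the names `PlaneCandidate` and `matches` are invented for illustration. The point is that a surface qualifies if it falls anywhere inside a min/max range, rather than matching an exact size:

```python
from dataclasses import dataclass

@dataclass
class PlaneCandidate:
    """A detected surface, simplified: an alignment plus width/depth in meters."""
    alignment: str   # "horizontal" or "vertical"
    width: float
    depth: float

def matches(plane, alignment, min_size, max_size=(float("inf"), float("inf"))):
    """Fuzzy-authoring style check: the plane qualifies if it lies anywhere
    within the [min, max] bounds, instead of equaling an exact measurement."""
    if plane.alignment != alignment:
        return False
    return (min_size[0] <= plane.width <= max_size[0]
            and min_size[1] <= plane.depth <= max_size[1])

# A minimum horizontal surface of roughly 2 ft x 3 ft (0.6 m x 0.9 m):
desk = PlaneCandidate("horizontal", width=1.2, depth=1.0)
wall = PlaneCandidate("vertical", width=3.0, depth=2.5)

print(matches(desk, "horizontal", min_size=(0.6, 0.9)))  # True: big enough
print(matches(wall, "horizontal", min_size=(0.6, 0.9)))  # False: wrong alignment
```

Because the check is a range test rather than an equality test, the same proxy can match a small desk in one room and a large workbench in another, which is what makes the authored content portable across environments.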
Users can then use the compare tool to see where in the simulation environment these conditions are met. (See the comparison screenshot above.) We will dig deeper into how to set these up in a future blog post.
The powerful feature here is that with the click of a button the developer can visualize where these conditions are met and what types of surfaces the crystals will appear on. With the original workflow, this could have taken 15–20 minutes per location just walking around scanning the environment. With MARS it is a single click and instantly viewable.
Next, if they wish, developers can also scan the virtual environment as if they were holding a mobile device. Check out the gifs below to see what this looks like.
Another difficult problem AR developers face is scaling content relative to the real world. The MARS Session allows you to adjust the entire scene scale with a simple slider. This is useful because it is better to create content that can adapt to the user’s environment than require the user to find an appropriate environment for your application.
MARS has many more features that we are excited to dig into and will cover in future blog posts.
This initial look only scratches the surface, but it shows that MARS can easily save creators and developers several hours per day or week, depending on what and how much you’re building. That time can then be spent making a more robust application that handles a wide variety of environments and conditions, with many more unique interactions.
Our vote: it is absolutely worth the additional fee to use Unity’s MARS, especially for professional developers and creators. Moreover, the 45-day trial period is an excellent opportunity for creators to get a hands-on feel and make the decision for themselves.
At XR Terra, we are very excited for ourselves and our AR & VR Developer Program students to use Unity’s MARS for their Augmented Reality and Virtual Reality industry projects! We will share our challenges and insights using MARS and other AR/VR tools in future blog posts. So stay tuned, and let us know if you want to learn about a specific AR/VR solution!