
Deep Dive: How Owlchemy adapted its VR titles for Apple Vision Pro

Game Deep Dives is an ongoing series with the goal of shedding light on specific design, art, or technical features within a video game in order to show how seemingly simple, fundamental design decisions aren't really that simple at all.

Earlier installments cover topics such as how GOG improved on the imperfect with its re-release of Alpha Protocol, how Ishtar Games created a new race of dwarfs in The Last Spell, and how Krillbite Studio cooked up a playful food-preparation experience in Fruitbus.

In this edition, the Owlchemy Labs team walks us through the technical challenges of bringing its VR titles to Apple Vision Pro.

The launch of the Apple Vision Pro in February marked a big moment for the VR community: it was the first major six-degree-of-freedom headset to ship without controllers.

Senior platform engineer Phillip Johnson is happy to explain how we brought Job Simulator and Vacation Simulator to Apple Vision Pro. We'll walk through the methods we used to implement hand tracking and look at the challenges we faced with shaders and the audio system. By sharing our experience, we hope to see more great fully immersive titles come to the visionOS platform.

30hz hand tracking in a 90hz game

Undoubtedly, the biggest challenge we faced during the entire production of this port was compensating for hand tracking that refreshes at only 30hz. Both Job Simulator and Vacation Simulator are deeply interactive experiences. Updating the hands only once every three frames had quite a few side effects when we started working on these ports. Grabbing and throwing things was almost impossible. Hand velocities would be exaggerated, causing fragile objects like plates to shatter in our hands. We could also lose tracking whenever we looked away from our hands. Our titles were unplayable, and it was unclear when an update would be available, so our team set out to solve our hand tracking problems with what was available at the time.

Senior game engineer Greg Tamargo describes a simple way to smooth hand tracking with extrapolation.

Because hand tracking updates at 30hz while the game updates at 90hz, for every frame with fresh hand data there are at least two frames with no new data at all. Because of this, we had to modify the Unity XR VisionOS package to tell us whether the data was "fresh" or "stale" so we could compensate accordingly. We found that covering the stale frames by interpolating between back-to-back fresh hand poses lagged too far behind and felt unresponsive, so we chose to extrapolate where the hands would be while waiting for the next fresh hand pose to arrive. Anyone with experience in online multiplayer games is probably familiar with this technique. By keeping track of at least two fresh hand poses, we could estimate velocity and angular velocity and use those to project the pose forward based on how much time had passed since the last frame of fresh data, which made a significant improvement in responsiveness and game feel. It's also worth noting that by keeping additional frames of fresh hand data we could have built more sophisticated predictions of where the hands would be, but when we implemented this it wasn't obvious that it improved the game much.
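To make the technique concrete, here is a minimal sketch of extrapolating a wrist pose from the two most recent fresh samples, assuming a simple timestamped pose struct; the names are illustrative, not Owlchemy's shipped code.

```csharp
using UnityEngine;

// Hypothetical timestamped hand pose sample (30hz tracking data).
struct HandSample
{
    public Vector3 position;
    public Quaternion rotation;
    public float timestamp;
}

static class HandExtrapolation
{
    // Predict the wrist pose at time 'now' from the two most recent fresh samples.
    public static void Extrapolate(HandSample previous, HandSample latest, float now,
                                   out Vector3 position, out Quaternion rotation)
    {
        float dt = Mathf.Max(latest.timestamp - previous.timestamp, 1e-4f);
        float elapsed = now - latest.timestamp;

        // Linear velocity between samples, projected forward in time.
        Vector3 velocity = (latest.position - previous.position) / dt;
        position = latest.position + velocity * elapsed;

        // Angular velocity: the rotation delta over dt, scaled to the elapsed time.
        Quaternion delta = latest.rotation * Quaternion.Inverse(previous.rotation);
        delta.ToAngleAxis(out float angle, out Vector3 axis);
        if (angle > 180f) angle -= 360f;                  // take the shortest arc
        rotation = Quaternion.AngleAxis(angle * (elapsed / dt), axis) * latest.rotation;
    }
}
```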

Regardless, implementing this extrapolation on the wrist made a huge improvement. When we tried to extend the approach to the other hand joints that drive the movement of each finger, the gains were much more limited. So we decided to try something else to smooth out the finger joints.


Engineer and hand-tracking specialist Marc Huet provides insight into the decisions we made about posing the hands.

For the rendered hands, we wanted to avoid making things up, so we tried to work with real pose data from the device as much as possible instead of synthesizing it ourselves.

To address the low update frequency, we introduced a small delay so that we could blend between the two latest poses while waiting for the next one. To ensure this delay doesn't negatively affect gameplay, we use the most recently received state when detecting actions such as "grab" and "release," while the smoothed version is reserved for posing the hand model.
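A rough sketch of that delayed blend, assuming parent-relative joint rotations stored with timestamps (the names here are hypothetical):

```csharp
using UnityEngine;

static class HandPoseBlending
{
    // Blend a joint's parent-relative rotation between the two newest 30hz samples,
    // evaluated at a render time that sits slightly in the past (Time.time - delay).
    public static Quaternion BlendJoint(Quaternion olderLocalRotation, float olderTime,
                                        Quaternion newerLocalRotation, float newerTime,
                                        float renderTime)
    {
        // InverseLerp clamps to [0, 1], so the visual pose never overshoots the data;
        // gameplay actions like grab/release read the raw latest state instead.
        float t = Mathf.InverseLerp(olderTime, newerTime, renderTime);
        return Quaternion.Slerp(olderLocalRotation, newerLocalRotation, t);
    }
}
```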

We also took a careful approach to filling in the gaps when individual joints lost tracking. Instead of trying to synthesize new data for the missing joint with IK, we copy the missing joint's orientation relative to its parent from the previous frame while leaving all the parent-child relationships intact (see figure).

Both methods were greatly simplified by maintaining and working with joint orientations relative to the parent joint rather than relative to the hand or world origin.
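A small sketch of what that per-joint fallback can look like when rotations are cached parent-relative (again, hypothetical names rather than the shipped implementation):

```csharp
using UnityEngine;

// Cache of parent-relative joint rotations for one hand. When a joint loses
// tracking, we keep its last known parent-relative rotation, so the joint still
// follows its parent and the hand stays coherent without any IK guesswork.
public class JointFallbackCache
{
    readonly Quaternion[] localRotations;

    public JointFallbackCache(int jointCount)
    {
        localRotations = new Quaternion[jointCount];
        for (int i = 0; i < jointCount; i++)
            localRotations[i] = Quaternion.identity;
    }

    // Returns the rotation to pose this joint with on the current frame.
    public Quaternion Update(int jointIndex, bool isTracked, Quaternion trackedLocalRotation)
    {
        if (isTracked)
            localRotations[jointIndex] = trackedLocalRotation;
        return localRotations[jointIndex];   // stale value is reused when untracked
    }
}
```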

[Figure: OwlchemyLabs_TrackedJoints.PNG]

Apple has since announced that it will support 90hz hand tracking in the visionOS 2.0 update, and we'll be sure to update our titles when that update rolls out.

Building shaders (and humor)

Unity compiles and caches shaders the first time they are rendered. This compilation causes brief framerate hitches, which is unacceptable on a VR platform because it can cause motion sickness. Due to the spatial nature of visionOS, there are restrictions that required us to rethink how and when we could warm up shaders. visionOS requires its applications to draw a frame at least every two seconds or the application will be terminated; this makes sense in a spatial environment where users may have multiple applications running, but in games it's common to hide shader compilation behind loading sequences. With the two-second constraint, we couldn't use the standard approach to shader warmup, so we had to develop a new one from scratch.

Our chief graphics engineer, Ben Hopkins, led us to our solution. To warm up the shaders correctly, we needed each unique vertex layout and shader variant to be rendered once, off-screen, during the boot sequence. To do this, we created a simple tool that collects and records the vertex layouts from every mesh in the game. Those records are fed into our warmup system, where players experience one big shader warmup the first time they run Vacation Simulator. The sequence creates a quad for each vertex layout and cycles through our shader variants for each of them. It admittedly takes a painful three to four minutes to finish, so we tried to lighten the experience with the best jokes the port team could write in an hour to keep the player entertained. Once the shaders are built, the game starts immediately on subsequent launches.
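A simplified sketch of this kind of warmup pass in Unity, assuming the vertex-layout tool has already produced one representative mesh per layout plus the list of materials to warm; it illustrates the approach rather than the actual tool:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical off-screen shader warmup pass.
public class ShaderWarmup : MonoBehaviour
{
    public Mesh[] representativeMeshes;   // one mesh per unique vertex layout
    public Material[] materialsToWarm;    // every shader/material variant used in the game

    public IEnumerator Warm()
    {
        // Render into a tiny off-screen target so the player never sees the quads,
        // while the shaders are still genuinely compiled for the GPU.
        var target = new RenderTexture(64, 64, 24);
        var cameraGO = new GameObject("WarmupCamera");
        var warmupCamera = cameraGO.AddComponent<Camera>();
        warmupCamera.targetTexture = target;

        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.position = cameraGO.transform.position + cameraGO.transform.forward * 2f;
        var filter = quad.GetComponent<MeshFilter>();
        var meshRenderer = quad.GetComponent<MeshRenderer>();

        foreach (var mesh in representativeMeshes)
        {
            filter.sharedMesh = mesh;
            foreach (var material in materialsToWarm)
            {
                meshRenderer.sharedMaterial = material;
                // Yield so the frame is actually rendered, keeping the app drawing
                // frames and inside visionOS's two-second watchdog.
                yield return null;
            }
        }

        Destroy(quad);
        Destroy(cameraGO);
        target.Release();
    }
}
```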


Making room for spatial audio

Daniel Perry, director of audio for Owlchemy Labs, explains how we were able to solve audio problems for our visionOS ports.

The biggest challenge we needed to solve with sound was that Fully Immersive mode didn't have access to the Apple spatializer in Unity, and spatial sound is essential for our experiences to sell the environment and create a clear, responsive soundscape. We needed to find a solution that would fit the designs of both Job Simulator and Vacation Simulator. Apple has PHASE (Physical Audio Spatialization Engine), which works with Unity, but using it would have required major changes to our audio pipeline, including routing, processing, and file loading.

Currently, the market is still short on spatializer solutions for Unity, and most of the existing ones do not support visionOS.

The Resonance Audio spatializer is open source and versatile, but it has seen little attention for some time and was not compiled for visionOS. Fortunately, because the source is available, we were able to modify it so it could be built for visionOS.

Because of Resonance's limited routing options, we had to create a custom solution for reverb. For performance reasons on mobile platforms, we have always used simple reverb algorithms with presets for different rooms and environments, and separate audio mixer groups to cover the effects in the game. While we couldn't reproduce every effect in our chain of audio mixers, it was important to keep the spirit and feel of the world as a whole, so we created our own send/receive solution that takes a pre-spatializer send from every audio source and routes it to shared reverb sources feeding a non-spatialized AudioMixer.

While this isn't the exact same chain of operations, it allowed us to use Resonance while keeping the same capabilities for our mixer groups and post effects, maintaining overall consistency with our game on other platforms, and preserving audio performance. Resonance ended up being quite compatible with the way our sound system is designed.
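As a rough illustration of the send/receive routing idea in Unity (the component and field names are hypothetical, and the real system is more involved):

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Hypothetical reverb "send": each spatialized source also plays a duplicate of
// its clip on a shared, non-spatialized source routed to an AudioMixer group
// that hosts the reverb preset for the current room.
public class ReverbSend : MonoBehaviour
{
    public AudioSource drySource;          // spatialized (e.g. via Resonance)
    public AudioMixerGroup reverbGroup;    // shared "room reverb" mixer group
    [Range(0f, 1f)] public float sendLevel = 0.3f;

    AudioSource wetSource;

    void Awake()
    {
        // The wet copy is not spatialized; the mixer group applies the reverb.
        wetSource = gameObject.AddComponent<AudioSource>();
        wetSource.outputAudioMixerGroup = reverbGroup;
        wetSource.spatialBlend = 0f;
        wetSource.volume = sendLevel;
    }

    public void Play(AudioClip clip)
    {
        drySource.PlayOneShot(clip);   // direct, spatialized path
        wetSource.PlayOneShot(clip);   // reverb send path
    }
}
```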

Conclusion

When the Apple Vision Pro first shipped, we didn't know if the issues holding us back would be resolved in a month or a year, but we knew we wanted to be there as soon as possible. Apple shares our passion for hands-only tracking experiences, as we feel they are more approachable for a mainstream audience. Thanks to our ability to develop our own tools to solve some of these problems, our developers were able to launch on Apple Vision Pro months before the visionOS 2.0 update. We are proud of the work we have done to deliver Job Simulator and Vacation Simulator on visionOS, and we're excited for new players to discover our award-winning titles.


