The S20 Ultra is the non plus ultra (pun intended) of the current Samsung lineup. Since Samsung is one of the highest-quality manufacturers of Android devices, the AR capabilities we are about to test represent the high end of what is available on the market to date.

Warning: this is not a generic device test. This article focuses on the AR capabilities of Samsung's newest flagship, the S20 Ultra.

What do you do when you are offered the chance to test such a device for a few hours, with just a day's notice?


We accepted immediately, without hesitation, even though the offer came completely out of the blue and our platform is still in its alpha testing phase. It was a good chance to validate that AR-ON can even be used for stress testing hardware, without any major modifications.

So, let’s see what the S20 Ultra is capable of!

Tracking

The most fundamental part of any immersive AR experience is tracking. AR content floating away immediately breaks the magic.

The S20 Ultra did an excellent job in far-from-optimal lighting conditions. We ran this test at night, in a living room with an open kitchen, decently lit by five 7 W LED spotlights, two of them pointed at the table. This was enough to keep the content in position, even when stress tested with all models loaded (more on that later) and with real-time reflections and screen recording turned on to make the frame rate drop.

To push its boundaries, we turned off the lights and kept only the countertop lights on, to have at least some illumination. To our great surprise, the S20 Ultra kept tracking. The only way to confuse it was to start shaking it in this minimal lighting. Only then did it lose tracking.

So unless you plan on experiencing AR in the dark while shaking your phone, it will most probably just keep tracking.
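
For the technically curious: if an AR app is built on Unity's AR Foundation, it can ask the session why tracking was lost. A rough sketch – the component name and the log messages are just illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Logs why the AR session is no longer tracking, e.g. low light or fast movement.
public class TrackingWatcher : MonoBehaviour
{
    void OnEnable()  { ARSession.stateChanged += OnStateChanged; }
    void OnDisable() { ARSession.stateChanged -= OnStateChanged; }

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        if (args.state == ARSessionState.SessionTracking)
            return;

        switch (ARSession.notTrackingReason)
        {
            case NotTrackingReason.InsufficientLight:
                Debug.Log("Tracking lost: the scene is too dark.");
                break;
            case NotTrackingReason.ExcessiveMotion:
                Debug.Log("Tracking lost: the device is moving too fast.");
                break;
            default:
                Debug.Log($"Tracking limited: {ARSession.notTrackingReason}");
                break;
        }
    }
}
```

Insufficient light and excessive motion are exactly the reasons ARCore reports for the combination that finally broke tracking above.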

Real-time reflections

We have an (as yet) unoptimized trick in our latest version that gives the feeling of the augmented content reflecting the physical environment. Unoptimized, because it takes two rendering passes instead of one, straining the hardware. This is usually undesirable. However, it comes in pretty handy when you are trying to find out where bleeding-edge tech actually starts to bleed. The frame rate achieved by the S20 Ultra is not completely smooth – as expected – but it still kept the reflections updating at about 10 fps. With the app itself running normally (over 30 fps), that result is not bad at all.
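
For context, one generic way to get this kind of two-pass effect in Unity is a scripted real-time reflection probe – not necessarily how our trick is implemented, just a sketch of why the second rendering pass costs frame rate (the component name and update interval below are made up):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Re-renders a reflection probe at a reduced rate, so the reflections
// are refreshed by a second rendering pass without running every frame.
public class RealtimeReflections : MonoBehaviour
{
    public ReflectionProbe probe;        // assumed to be assigned in the Inspector
    public float updateInterval = 0.1f;  // roughly 10 reflection updates per second
    float nextUpdate;

    void Start()
    {
        probe.mode = ReflectionProbeMode.Realtime;
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        probe.timeSlicingMode = ReflectionProbeTimeSlicingMode.AllFacesAtOnce;
    }

    void Update()
    {
        if (Time.time < nextUpdate)
            return;

        probe.transform.position = Camera.main.transform.position;
        probe.RenderProbe();             // the extra rendering pass
        nextUpdate = Time.time + updateInterval;
    }
}
```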

Screen recording

All the AR footage you see here was recorded by the S20 Ultra itself at 720×1440. Unoptimized screen recording, combined with pushing up the size of the loaded models (in terms of megabytes), can easily run the device out of memory and force the application to quit. So this is exactly what we did.

We started recording, loaded everything we had, and waited… and waited… and waited. The 12 GB of RAM proved to be more than enough. Eventually we gave up on trying to run it out of memory.
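
If you want to watch memory during a stress test like this, Unity's profiler API and the OS low-memory callback are enough; a minimal sketch (the logging interval and component name are our own choices):

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Logs allocated memory once per second and warns when the OS
// signals low memory, which usually precedes the app being killed.
public class MemoryWatcher : MonoBehaviour
{
    void OnEnable()  { Application.lowMemory += OnLowMemory; }
    void OnDisable() { Application.lowMemory -= OnLowMemory; }

    void Start() => InvokeRepeating(nameof(LogMemory), 1f, 1f);

    void LogMemory()
    {
        long allocatedMB = Profiler.GetTotalAllocatedMemoryLong() / (1024 * 1024);
        long reservedMB  = Profiler.GetTotalReservedMemoryLong()  / (1024 * 1024);
        Debug.Log($"Memory: {allocatedMB} MB allocated / {reservedMB} MB reserved");
    }

    void OnLowMemory() => Debug.LogWarning("Low memory warning from the OS!");
}
```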

Triangles

Every 3D model is rendered from its surface, which is represented by triangles. The more detailed the model, the higher the triangle count, and thus the more GPU performance and memory are needed. More and larger models loaded means more stress on the hardware. For this very reason we incrementally loaded everything we had in stock. By the end, we had loaded four car models, adding up to about 4 million rendered triangles. The frame rate obviously dropped to a terrible 3–5 fps; the tracking, however, remained stable. Our testing app had no option to duplicate models, and the strict time window didn’t allow changing that, but we cannot have been far from the maximum capacity of the hardware. Maybe next time we can push it until it quits.
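
For reference, the triangle count of a loaded model can be summed from its meshes with a few lines of Unity code; a minimal sketch, assuming standard (and skinned) meshes, with a helper name of our own:

```csharp
using UnityEngine;

// Sums the triangle count of every mesh found under a loaded model.
public static class TriangleCounter
{
    public static int CountTriangles(GameObject model)
    {
        int triangles = 0;

        foreach (var filter in model.GetComponentsInChildren<MeshFilter>())
            if (filter.sharedMesh != null)
                triangles += filter.sharedMesh.triangles.Length / 3;

        foreach (var skinned in model.GetComponentsInChildren<SkinnedMeshRenderer>())
            if (skinned.sharedMesh != null)
                triangles += skinned.sharedMesh.triangles.Length / 3;

        return triangles;
    }
}
```

Calling something like this after each load makes it easy to see how close the running total gets to the point where the frame rate collapses.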

VFX

We built our newest platform around the latest developments, which allow the VFX particle count to be increased to a level that was previously impossible: from a few hundred, or maybe a few thousand, to a few hundred thousand. We didn’t yet have any sample assets of our own demonstrating this feature, so we took one of Dilmer Valecillos’ scenes publicly available on GitHub and pushed it a bit. We increased the capacity of the VFX particle system by an order of magnitude and, just for fun, also duplicated the whole VFX object. The result is more than 5 million particles rendered in real time. How did the S20 Ultra handle that? Without any issues.
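
Assuming the scene is built with Unity's VFX Graph – which Dilmer Valecillos’ public samples use – duplicating the effect and reading back the live particle count can look roughly like this; the capacity itself is raised in the graph asset rather than from script, and the component name is ours:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Duplicates a VFX Graph effect and logs the total number of
// particles currently alive across both copies.
public class VfxStressTest : MonoBehaviour
{
    public VisualEffect original;   // assumed to be assigned in the Inspector
    VisualEffect duplicate;

    void Start()
    {
        // "Double" the effect by instantiating a second copy next to the first.
        duplicate = Instantiate(original,
                                original.transform.position + Vector3.right,
                                original.transform.rotation);
    }

    void Update()
    {
        int total = original.aliveParticleCount + duplicate.aliveParticleCount;
        Debug.Log($"Particles alive: {total:N0}");
    }
}
```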

As a benchmark, we also shot a video of how an older S8 deals with the same task. The frame rate is terrible, and it cannot even render both VFX objects in the same frame.
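
For side-by-side comparisons like this, a simple smoothed frame-rate counter on screen is all you need; a minimal sketch (the smoothing factor is arbitrary):

```csharp
using UnityEngine;

// Displays a smoothed frames-per-second value in the corner of the screen.
public class FpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // An exponential moving average keeps the readout from jumping around.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
    }

    void OnGUI()
    {
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);
        GUI.Label(new Rect(10, 10, 200, 30), $"{fps:F1} fps");
    }
}
```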

Summary

The Samsung Galaxy S20 Ultra delivers a decent augmented reality experience in any reasonable scenario. We could make it sweat, but – to be honest – the feeling was mutual. The powerful cameras, together with the depth sensor, open up new possibilities for effects like real-world occlusion, making AR experiences even more realistic. The 5G capability will enable developers and users to leverage the power of cloud rendering. The high-performance CPU and GPU and 12 GB of RAM, combined with these cameras, provide everything AR developers and users need today. It is also geared up for the next few years, when we are about to see the rise of 5G networks and applications.
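
For what it is worth, this kind of real-world occlusion can already be requested through AR Foundation's occlusion manager (version 4.1 or newer); whether the S20 Ultra's depth sensor actually gets used is up to ARCore, and the component name below is ours:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Requests environment depth so virtual content can be occluded
// by real-world objects on devices that support it.
[RequireComponent(typeof(AROcclusionManager))]
public class OcclusionSetup : MonoBehaviour
{
    void Start()
    {
        var occlusion = GetComponent<AROcclusionManager>();
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
        Debug.Log($"Requested environment depth: {occlusion.requestedEnvironmentDepthMode}");
    }
}
```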