At Google I/O 2021, Google is announcing some important upgrades for ARCore, the company's augmented reality platform that powers more than 850 million Android smartphones worldwide. Unlike Project Tango, which required specialized hardware, ARCore relies on a phone's existing cameras and sensors for depth estimation, motion tracking, and light estimation, helping developers build interactive AR experiences.

Since its launch, Google has steadily expanded ARCore's capabilities, pushing the limits of what AR developers can accomplish with the existing hardware of Android smartphones. Last year, Google released the ARCore Depth API, which lets developers generate a depth map from a single RGB camera and create more realistic AR experiences. Today, the company is adding two new tools to ARCore's arsenal: the Raw Depth API and the Recording and Playback API.

ARCore Raw Depth API

The new Raw Depth API builds on the Depth API to provide more detailed representations of surrounding objects by generating raw depth maps with corresponding confidence images. While the Depth API produces smoothed depth maps with a depth estimate for every pixel, the Raw Depth API captures more realistic, unsmoothed depth maps, and its confidence images indicate how reliable each pixel's depth estimate is.

[Image: ARCore Raw Depth API illustration]
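To give a sense of how this fits together, here's a minimal sketch in Java of enabling raw depth and reading the paired depth and confidence images. It assumes an already-created ARCore `session` and a per-frame `frame` object, and uses the raw-depth calls introduced alongside this release.

```java
import android.media.Image;
import com.google.ar.core.Config;
import com.google.ar.core.exceptions.NotYetAvailableException;

// Enable raw depth if the device supports it. RAW_DEPTH_ONLY skips the
// smoothed depth map; use AUTOMATIC to receive both kinds of depth data.
Config config = session.getConfig();
if (session.isDepthModeSupported(Config.DepthMode.RAW_DEPTH_ONLY)) {
  config.setDepthMode(Config.DepthMode.RAW_DEPTH_ONLY);
  session.configure(config);
}

// Per frame: the raw depth map comes paired with a confidence image whose
// pixels range from 0 (no confidence) to 255 (full confidence).
try (Image rawDepth = frame.acquireRawDepthImage();
     Image confidence = frame.acquireRawDepthConfidenceImage()) {
  // e.g. keep only depth pixels whose confidence clears a threshold,
  // then build a point cloud or mesh from the survivors.
} catch (NotYetAvailableException e) {
  // Depth isn't ready for this frame yet; try again on a later frame.
}
```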

Another area of improvement is hit-testing, which can now use the depth map instead of detected planes, returning hit-test results even on non-planar and low-texture floors. TeamViewer's LifeAR app has used the depth hit-test to integrate AR capabilities into video calls.
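In practice, the change is mostly transparent to developers: the same hit-test call can now return depth-based results where no plane was found. A rough sketch, assuming a user tap at the hypothetical screen coordinates `tapX` and `tapY`:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.DepthPoint;
import com.google.ar.core.HitResult;

// With a depth mode enabled, hit-tests fall back to the depth map on
// surfaces where plane detection fails (non-planar or low-texture areas).
for (HitResult hit : frame.hitTest(tapX, tapY)) {
  if (hit.getTrackable() instanceof DepthPoint) {
    Anchor anchor = hit.createAnchor(); // place content at the depth hit
    break;
  }
}
```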

These new improvements don't require specialized hardware like time-of-flight (ToF) sensors, so they can run on the vast majority of ARCore-certified devices. The Raw Depth API and the depth hit-test are available to developers starting today.

ARCore Recording and Playback API

Alongside the new Raw Depth API, Google is also debuting a Recording and Playback API in ARCore that gives app developers greater flexibility when testing AR experiences. When building new AR experiences, developers often have to test repeatedly in specific environments and locations. With the new Recording and Playback API, developers can record video footage together with AR metadata, such as depth and IMU motion sensor data, and use it to recreate the same environment for further testing. The idea is to record the footage once and use it as a template for trying out other AR effects and experiences instead of shooting fresh footage each time. Ride-hailing app DiDi-Rider used the API to build and test AR-powered directions in its app, saving 25% on R&D costs and accelerating its development cycle by six months.
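As a sketch of what the recording side looks like in the Java API (the `mp4Path` output location is a placeholder; your app needs to supply a writable file path), starting and stopping a capture is roughly:

```java
import com.google.ar.core.RecordingConfig;
import com.google.ar.core.exceptions.RecordingFailedException;

// Record the session, including camera frames and sensor metadata,
// into an MP4 dataset that can be replayed later.
RecordingConfig recordingConfig =
    new RecordingConfig(session)
        .setMp4DatasetFilePath(mp4Path) // placeholder output path
        .setAutoStopOnPause(true);      // finalize the file on pause
try {
  session.startRecording(recordingConfig);
  // ... run the AR session as usual ...
  session.stopRecording();
} catch (RecordingFailedException e) {
  // I/O failure, insufficient storage, or an invalid session state.
}
```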

For end users, the Recording and Playback API also enables new AR experiences such as post-capture AR. This lets users feed previously recorded video to AR apps, removing the need to be physically present at a location or to run a live camera session. In other words, users can shoot footage once and add AR effects later. The ARCore Recording and Playback API is available to developers starting today, and you can read more about it on this page.
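For the curious, playback is the mirror image of recording: pointing a paused session at a recorded dataset makes ARCore serve frames from the file instead of the live camera. A minimal sketch, reusing the hypothetical `mp4Path` from the recording example:

```java
import com.google.ar.core.exceptions.CameraNotAvailableException;
import com.google.ar.core.exceptions.PlaybackFailedException;

// Playback must be configured while the session is paused.
session.pause();
try {
  session.setPlaybackDataset(mp4Path); // previously recorded MP4 dataset
  session.resume(); // frames now come from the recording
} catch (PlaybackFailedException | CameraNotAvailableException e) {
  // Dataset missing or corrupt, or the camera could not be reacquired.
}
```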