Optimizing 3D Models for ARCore (2024 Update)
ARCore is Google’s platform for building augmented reality experiences. Using different APIs, ARCore enables your phone to sense its environment, understand the world and interact with information. Some of the APIs are available across Android and iOS to enable shared AR experiences.
This guide outlines key considerations and steps for ensuring optimal performance of 3D assets, including polygon count, file size, texture resolution, and draw calls, using RapidPipeline to streamline the optimization process.
What is ARCore?
ARCore is a software development kit (SDK) developed by Google that allows developers to create augmented reality (AR) applications for Android devices. It provides APIs for tasks such as motion tracking, environmental understanding, and light estimation, which are essential for building AR experiences.
- Motion tracking: ARCore tracks the device's position and orientation in real-time, allowing virtual objects to stay anchored to the real world as the device moves.
- Environmental understanding: It detects and understands the environment by recognizing surfaces like floors and tables, enabling virtual objects to interact realistically with the physical world.
- Light estimation: ARCore estimates the lighting conditions of the environment, allowing virtual objects to cast shadows and blend more naturally with their surroundings.
- Plane detection: It identifies flat surfaces in the environment, such as floors or tables, where virtual objects can be placed.
- Point cloud generation: ARCore generates a 3D point cloud of the environment, which helps in accurately placing virtual objects and understanding the spatial layout.
- Anchors: Virtual objects can be anchored to specific points or surfaces in the real world, ensuring they remain fixed in place even as the user moves around.
- Augmented images: ARCore can recognize and track images in the environment, allowing developers to create AR experiences triggered by specific images.
- Cloud Anchors: It enables shared AR experiences by allowing multiple users to interact with the same virtual objects in the same physical space.
- Depth API: ARCore can estimate depth in the scene, enabling more immersive AR experiences and better interaction with the environment.
- Scene Viewer: Google's built-in AR viewer for glTF/glb models, which lowers the barrier to AR development and is useful for rapidly prototyping AR concepts.
What are the suggested 3D requirements for ARCore applications?
| Requirement | Details |
| --- | --- |
| File format support | glTF 2.0 / .glb (only a limited set of glTF extensions is supported) |
| Animation | Animations are played on a loop. If the glTF file contains multiple animations, Scene Viewer plays only the first one. |
| Recommended limits | Overall asset performance depends on setting constraints and making tradeoffs between vertices, materials, texture resolution, meshes per material, and other factors. Use these guidelines to optimize your assets. |
| Shadow support | Scene Viewer automatically renders hard shadows when an object is placed, so we recommend against baking shadows into your model. |
| Texture support | |
| Material | PBR |
| File loading | HTTPS |
| Scene | |
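To sanity-check an exported .glb against limits like these before uploading it, you can read the file's embedded JSON chunk directly, since a .glb is a small binary container around a glTF JSON document. The following is a minimal Python sketch, not an official ARCore or Scene Viewer tool; it assumes indexed triangle primitives and counts each mesh primitive as roughly one draw call:

```python
import json
import struct

GLB_MAGIC = 0x46546C67   # ASCII "glTF"
JSON_CHUNK = 0x4E4F534A  # ASCII "JSON"

def glb_stats(data: bytes) -> dict:
    """Parse the JSON chunk of a .glb and report rough asset metrics."""
    # 12-byte GLB header: magic, container version, total file length.
    magic, version, length = struct.unpack_from("<III", data, 0)
    if magic != GLB_MAGIC:
        raise ValueError("not a binary glTF (.glb) file")
    # First chunk header (8 bytes) must introduce the JSON chunk.
    chunk_len, chunk_type = struct.unpack_from("<II", data, 12)
    if chunk_type != JSON_CHUNK:
        raise ValueError("first GLB chunk must be the JSON chunk")
    gltf = json.loads(data[20:20 + chunk_len])

    triangles = 0
    draw_calls = 0
    accessors = gltf.get("accessors", [])
    for mesh in gltf.get("meshes", []):
        for prim in mesh.get("primitives", []):
            draw_calls += 1  # each primitive is roughly one draw call
            if "indices" in prim:
                # Assumes TRIANGLES topology: three indices per triangle.
                triangles += accessors[prim["indices"]]["count"] // 3
    return {
        "file_bytes": length,
        "meshes": len(gltf.get("meshes", [])),
        "draw_calls": draw_calls,
        "materials": len(gltf.get("materials", [])),
        "textures": len(gltf.get("images", [])),
        "triangles": triangles,
    }
```

Running this on a model before and after optimization gives a quick picture of how far triangle count, material count, and file size have been reduced.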
How to automatically optimize any asset to .glb for ARCore?
- Upload your 3D Asset to RapidPipeline
- Choose the 3D Processor preset "Single-Item Mobile MidRes" and adjust if needed
- Download your optimized model
- Open the optimized file in an AR viewer such as Scene Viewer to verify the result
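Once the optimized .glb is hosted over HTTPS, it can be opened in Scene Viewer via a link built on its URL scheme. The sketch below assembles such a link; the `file`, `title`, and `mode` query parameters reflect Scene Viewer's documented scheme, but treat the exact parameter set as an assumption to verify against the current Google documentation:

```python
from urllib.parse import urlencode

def scene_viewer_link(model_url: str, title: str = "", ar_only: bool = False) -> str:
    """Build an https link that opens a hosted .glb in Google Scene Viewer."""
    params = {"file": model_url}  # URL of the hosted .glb (must be HTTPS)
    if title:
        params["title"] = title
    if ar_only:
        params["mode"] = "ar_only"  # skip the 3D preview, go straight to AR
    return "https://arvr.google.com/scene-viewer/1.0?" + urlencode(params)
```

For example, `scene_viewer_link("https://example.com/model.glb", title="Chair")` yields a link that, when tapped on a supported Android device, launches the model in Scene Viewer (the hosting URL here is a placeholder).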
Meet the Author
DGG Team
The 3D Pipeline Company
DGG is on a mission to connect the real and virtual by making 3D models as easy to handle as 2D images.