Version: 5.x

World Mesh and Depth Texture

World Mesh provides a real-time 3D reconstruction of the real world based on what your device sees. This allows your experience to better match and respond to the environment that it’s in.

In 2021, Snap AR released the first version of World Mesh in Lens Studio for all devices, but the technology was limited in its accuracy and precision. As part of Lens Studio version 4.55, Snap released World Mesh 2.0, an update that significantly improves the accuracy, precision, and device support of reconstructed meshes. A new reconstruction approach makes non-LiDAR world meshing accurate enough to be a first choice, and it also increases the number of supported phones with access to high-quality, real-time 3D meshing.

World Mesh is a beta feature and is not available on all devices. Its capabilities vary depending on the device it is running on. However, you can start developing today as your Lens will automatically be updated to run on more devices and with the latest World Mesh as it arrives.

World Mesh Capabilities

When a device provides a World Mesh, you have access to the following:

  • World Mesh: An automatically generated mesh that represents the real world in 3D. You can use this to occlude AR effects based on the real world. Use it as you would any other mesh.
  • Hit Test Results: Using the ray casting or hit test function, you can learn about the mesh that was hit at a given point, as well as its position and normal. Learn more in the API documentation or in the scripting section below.
  • Depth Texture: You can get a texture representing the depth of the world in the camera view.

When using World Mesh, your Camera object should contain a Device Tracking component set to the World option so that the generated mesh can be accurately mapped to the real world.

Availability

World Mesh is available on the following:

  • Devices with LiDAR
  • Recent devices with ARKit and ARCore
  • Spectacles

However, there are some differences to note:

  • On non-LiDAR devices and Spectacles, the World Mesh will change over time as the Lens continues to refine its understanding of the surfaces it sees.
  • When LiDAR is available, doing a Hit Test will provide you with semantic information about the surface hit. Types provided are: wall, floor, ceiling, table, seat, window, door and none.
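As a sketch of how a Lens might react to those classifications, the plain-JavaScript helper below maps each classification name to an illustrative response. The `surfaceReactions` table and `reactionFor` function are hypothetical examples, not part of the Lens Studio API:

```javascript
// Hypothetical helper: pick a response for each surface classification
// a LiDAR hit test can return. The classification names mirror the
// types listed above; the reactions are illustrative only.
var surfaceReactions = {
    wall: 'attach a poster',
    floor: 'spawn a character',
    ceiling: 'hang a lamp',
    table: 'place a board game',
    seat: 'seat a virtual pet',
    window: 'add a light ray',
    door: 'mark an exit',
    none: 'fall back to generic placement'
};

function reactionFor(classification) {
    // Unknown or missing classifications fall back to 'none'
    return surfaceReactions[classification] || surfaceReactions.none;
}
```

On devices without LiDAR, you would only ever hit the `none` branch, so a fallback like this keeps the Lens working everywhere.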

On slower devices, World Mesh generation takes longer, so it may take more time for the Lens to receive a mesh. Older devices without OpenGL ES 3.0 support are not supported at all.

In general, devices with LiDAR will produce better, more accurate results thanks to the dedicated sensor. For example, World Mesh currently works better indoors than outdoors on devices without LiDAR, and Depth Texture data is more accurate on devices with LiDAR.

World Mesh in Action

The following are some examples of World Mesh in action.

Using World Mesh and Depth Textures

There are a variety of ways you can integrate geometric understanding of the world into your Lens!

Adding a World Mesh

You can add a World Mesh by pressing + > World Mesh in the Asset Browser panel

You can treat this mesh as you would any other mesh. For example, you can display the World Mesh:

You can quickly display a mesh by adding it from the Scene Hierarchy panel > + > World Mesh

Once a World Mesh is added, you can modify its settings via its API, or through the World Mesh resource in the Asset Browser panel.

  • Use Normals: whether surface normal information should be baked into the mesh.
  • Classification (for devices with LiDAR): whether classification should be baked into the mesh. Note that in Lens Studio, classification data will always be provided.

Adding a Depth Texture

In some cases, you may only need depth data rather than a constructed mesh, such as for a depth-of-field effect.

You can access the Depth Texture by going into the Asset Browser panel > + > Depth Texture

You can use this depth texture as an input into the Material Editor to create a shader-based depth effect.
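As an illustration of the kind of math a depth-of-field effect uses, the sketch below maps a single depth sample to a blur strength. In a real Lens this would happen per pixel in the Material Editor; `blurAmount`, `focusDistance`, and `focusRange` are names assumed for this example, not Lens Studio API:

```javascript
// Sketch of a depth-of-field falloff: map a depth sample (Lens Studio
// distances are in centimeters) to a blur strength in [0, 1].
// 0 = fully in focus, 1 = fully blurred.
function blurAmount(depth, focusDistance, focusRange) {
    // Distance from the focal plane, normalized by the in-focus range
    var t = Math.abs(depth - focusDistance) / focusRange;
    // Clamp to [0, 1]
    return Math.min(Math.max(t, 0), 1);
}

// A surface at the focal plane stays sharp...
blurAmount(100, 100, 50); // 0
// ...while one far behind it is fully blurred
blurAmount(250, 100, 50); // 1
```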

Like World Mesh, the Depth Texture can come from a variety of sources. For example, you may get depth data from a device with depth sensors in the front camera, through Spectacles' dual cameras, or through ARCore's depth data. When World Mesh is used, depth data will be provided by World Mesh.

Scripting and World Mesh

With scripting, you can get additional information about the World Mesh so that your Lens can respond to surfaces that it sees.

For example, the script below sends a ray from the center of the screen into the world until it hits a mesh, then prints information about the point on the mesh that was hit.

// @input Component.DeviceTracking tracker
// @input Component.RenderMeshVisual worldMesh

// Screen position we want to do a hit test on
var centerOfScreenPos = new vec2(0.5, 0.5);

// Check if World Mesh is available
if (script.tracker.worldTrackingCapabilities.sceneReconstructionSupported) {
    // Get the World Mesh API
    var worldTrackingProvider = script.worldMesh.mesh.control;

    // Set up World Mesh to check classification
    // Note: only available on devices with LiDAR
    worldTrackingProvider.meshClassificationFormat = MeshClassificationFormat.PerVertexFast;
    worldTrackingProvider.useNormals = true;

    // Do a hit test
    var hitTestResults = script.tracker.hitTestWorldMesh(centerOfScreenPos);

    hitTestResults.forEach(function (hitTestResult) {
        // Get vertex information
        var position = hitTestResult.position;
        var normal = hitTestResult.normal;

        // Classification possibilities at:
        // /api/classes/TrackedMeshFaceClassification/
        var classification = hitTestResult.classification;

        // Do something with the data
        print(
            'Hit mesh at ' + position +
            ' with normal of ' + normal +
            ' and classification of: ' + classification
        );
    });
}

Try using this data to modify the position and rotation of an object so that it is always facing up from the surface!
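One way to do that is to build a rotation that carries the world up axis onto the hit normal. The engine-agnostic sketch below uses plain arrays so the math is visible; inside a Lens, you could instead use the engine's `quat.rotationFromTo(vec3.up(), hitTestResult.normal)`:

```javascript
// Compute a quaternion [x, y, z, w] that rotates the world "up" axis
// (0, 1, 0) onto a unit surface normal, so an object placed at a hit
// point faces away from the surface.
function rotationFromUpTo(normal) {
    var up = [0, 1, 0];
    // Dot product gives cos(angle) between the two unit vectors
    var dot = up[0] * normal[0] + up[1] * normal[1] + up[2] * normal[2];
    if (dot < -0.999999) {
        // Normal points straight down: any 180° flip works, pick the X axis
        return [1, 0, 0, 0];
    }
    // Cross product gives the rotation axis (scaled by sin(angle))
    var axis = [
        up[1] * normal[2] - up[2] * normal[1],
        up[2] * normal[0] - up[0] * normal[2],
        up[0] * normal[1] - up[1] * normal[0]
    ];
    // Un-normalized half-angle quaternion, then normalize
    var w = 1 + dot;
    var len = Math.sqrt(axis[0] * axis[0] + axis[1] * axis[1] + axis[2] * axis[2] + w * w);
    return [axis[0] / len, axis[1] / len, axis[2] / len, w / len];
}

// A floor normal (straight up) needs no rotation: identity quaternion
var floorRotation = rotationFromUpTo([0, 1, 0]); // [0, 0, 0, 1]
```

You could then apply the resulting rotation to the object's transform each time a new hit test result arrives.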

If you are building for Spectacles, consider the WorldQueryModule, which allows you to ray cast against the world without waiting for a World Mesh and at a lower performance cost.

Passing World Mesh Data

In addition to rendering the World Mesh itself, you can use the World Mesh as a source of data. For example, you can use it to understand normals and surfaces in the world so that VFX can collide with the real world!

Take a look at the Simple World Mesh template to see various examples of this in action!
