World AR Overview
This is the overview page for all World AR technology. We focus on tracking, localization, depth, and meshing technologies used mainly on the rear camera. Here we will share the latest updates for these technologies. Please email enersesian@snap.com for feedback, suggestions, and questions.
Recent Technology Releases
Instant World Hit Test
The Instant World Hit Test asset enables Snapchatters to start interacting with world-facing Lenses instantly. Currently, a UI prompt must ask the user to look around for a few seconds while the World Mesh generates, which causes a decrease in user engagement.
With this Asset Library Custom Component, the Lens Creator can hit test directly against the depth texture and register a callback for when a more accurate World Mesh hit result becomes available. This increases user engagement for use cases such as AR object placement and projectile shooting.
Notice that the user can shoot candy canes instantly with this Lens. The blue candy canes were shot using a hit result from the depth texture, since the World Mesh had not yet been generated in that area. Once the World Mesh is generated there, the candy canes turn from blue to red to indicate that their position and orientation have been upgraded from a depth texture hit result to a World Mesh hit result.
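As a rough sketch of that flow, assuming a hypothetical `hitTest(screenPosition, callback)` method on the Custom Component and a result object carrying `position` and an `isWorldMeshResult` flag (the component's real API may differ; check its documentation in the Asset Library):

```js
// @input Component.ScriptComponent instantWorldHitTest // the Custom Component (hypothetical input)
// @input SceneObject candyCane                         // object to place at the hit point

// Hit test at the center of the screen. The callback may fire immediately
// with a depth texture result, and again later with a World Mesh result.
script.instantWorldHitTest.hitTest(new vec2(0.5, 0.5), function (result) {
    script.candyCane.getTransform().setWorldPosition(result.position);
    if (result.isWorldMeshResult) {
        // Position and orientation are now based on the reconstructed mesh,
        // mirroring the blue-to-red candy cane upgrade described above.
    }
});
```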
World Mesh 2.0
In 2021, Snap AR released the first version of World Mesh in Lens Studio for all devices. However, the technology was limited in its accuracy and precision. As part of Lens Studio version 4.55, Snap has released World Mesh 2.0, an update that represents a significant improvement in the accuracy, support, and precision of reconstructed meshes. Our novel approach to increasing mesh accuracy flips the script on why non-LiDAR world mesh wasn't a developer's first choice. In doing so, we also increased the number of supported phones with access to high-quality 3D meshing in real time.
As shown in the visuals below, the previous technology produced much coarser and far less precise scans that ended up garbled and messy. Now, with the 2.0 technology, the result is much finer, more accurate, and more realistic.
| World Mesh 1.0 | World Mesh 2.0 |
| --- | --- |
| ![]() | ![]() |
With World Mesh 2.0, developers using the Custom Location Creator Lens can take advantage of new features such as:
- Higher-quality meshes.
- Location Meshes are now in color.
- The Lens no longer requires a LiDAR-powered phone, meaning iPhone 6s and up are supported, along with high-end Android devices.
World Tracking Technologies
Device Tracking
World AR technology offers several tracking solutions through a Device Tracking component placed on a scene object that also has a Camera component. The most relevant and modern tracking solution is a Device Tracking component set to World tracking mode, which enables additional World AR technologies such as Instant World Hit Test, World Mesh, and the depth texture.
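A minimal sketch of requesting World tracking from a script, assuming the Device Tracking component is wired in as a script input (`requestDeviceTrackingMode` and `DeviceTrackingMode.World` follow the Lens Studio scripting API, but verify against your Lens Studio version):

```js
// @input Component.DeviceTracking deviceTracking // on the same object as the Camera

// Request World mode so World Mesh, Instant World Hit Test,
// and the depth texture become available to this Lens.
script.deviceTracking.requestDeviceTrackingMode(DeviceTrackingMode.World);
```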
Instant World Hit Test
This Asset Library Custom Component is described in detail under Recent Technology Releases above: it lets a Lens hit test directly against the depth texture for instant interaction, then upgrade to a more accurate World Mesh hit result once one becomes available.
Depth Texture
This documentation page is currently under development. It will explain the Depth Texture as a foundational technology for World Mesh and explore some of its use cases, such as occlusion and colored filters.
World Mesh
World Mesh provides a real-time 3D reconstruction of the real world based on what your device sees. This allows your experience to better match and respond to the environment that it’s in.
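A minimal sketch of hit testing against the reconstructed mesh from a script; `hitTestWorldMesh` on the Device Tracking component follows the Lens Studio API, while the script inputs here are assumptions for illustration:

```js
// @input Component.DeviceTracking deviceTracking // set to World tracking mode
// @input SceneObject marker                      // object to snap onto the mesh (assumed input)

// Cast from the center of the screen into the World Mesh.
var results = script.deviceTracking.hitTestWorldMesh(new vec2(0.5, 0.5));
if (results.length > 0) {
    var hit = results[0]; // first hit along the ray
    script.marker.getTransform().setWorldPosition(hit.position);
    // hit.normal gives the surface orientation at the hit point,
    // useful for aligning placed content with walls or floors.
}
```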
Location AR Technologies
Custom Location Creator Lens
The Custom Location Creator Lens allows you to use your mobile phone to create Custom Locations from different places and landmarks around the world.
Scan this QR code within the Snapchat app to get access to the Custom Location Creator Lens:
As described under World Mesh 2.0 above, Lenses created with the Custom Location Creator now benefit from higher-quality meshes, colored Location Meshes, and support for non-LiDAR phones (iPhone 6s and up, plus high-end Android devices).
Custom Locations
With Custom Location AR, you can map any location of your choosing, upload it to Lens Studio, and author AR content at that specific location. The provided template will guide you through the steps needed to create your own Custom Location AR Lenses quickly and easily.
City-Scale
The City-Scale AR Template allows you to create unique Lens experiences specific to select regions of certain cities around the world. The template provides a starter setup for City-Scale AR in addition to three example City-Scale AR Lenses.
Currently, Snap supports London, Downtown Los Angeles, and Santa Monica.
Landmarkers
The Landmarker Template allows you to create unique Lens experiences at selected locations around the world. Similar to Marker Tracking, Landmarker Tracking tracks the physical landmark at a specified location. The template shows you how to utilize Landmarker tracking to create a 3D experience tightly tracked to the location's architecture. It also includes debug features that allow you to test your Lenses without actually being at the location.
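A minimal sketch of reacting to a Landmarker being found, assuming a Device Location Tracking component wired in as a script input (the `onLocationFound` / `onLocationLost` callbacks follow the pattern used in the Landmarker template, but treat the exact names as assumptions):

```js
// @input Component.DeviceLocationTrackingComponent locationTracking
// @input SceneObject content // content anchored to the landmark (assumed input)

// Hide the content until the landmark is actually being tracked.
script.content.enabled = false;

script.locationTracking.onLocationFound = function () {
    script.content.enabled = true;
};
script.locationTracking.onLocationLost = function () {
    script.content.enabled = false;
};
```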
Additional Tracking Technologies
Marker Tracking
The Marker Tracking component allows you to track content to images in physical space. An ideal use case for Marker Tracking is a Lens that overlays content on a business card, poster, or mural.
To learn how you can add Marker Tracking to a Lens, visit the Marker Tracking guide. You can also try out the Marker Template and Marker with Snapcode Template to get started with Marker Tracking right away.
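A minimal sketch of driving Marker Tracking from a script, assuming the component and a marker image are wired in as inputs (`marker` is the component's marker asset property; the `onMarkerFound` callback name should be verified against your Lens Studio version):

```js
// @input Component.MarkerTrackingComponent markerTracking
// @input Asset.MarkerAsset markerAsset // the image to track (assumed input)

// Assign the marker image to track at runtime.
script.markerTracking.marker = script.markerAsset;

// React when the image is detected in the camera feed; content under
// this component is tracked to the image while it is visible.
script.markerTracking.onMarkerFound = function () {
    print("Marker found");
};
```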
Object Tracking
Object Tracking allows you to attach images and animations to certain objects found in the scene. Object Tracking currently supports the detection and tracking of a Cat, Dog, Cat and Dog, Hand and Body. Each Object Tracking type offers various Attachment Points to attach content to.
To learn how you can add Object Tracking to a Lens, visit the Object Tracking guide. You can also try out the Pet Template, Hand Template, Shoulder Template, Skeletal Template, Full Body Triggers, and Full Body Attachments to get started with Object Tracking right away.
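A minimal sketch of reacting to a tracked object, assuming an Object Tracking component as a script input (the `onObjectFound` / `onObjectLost` callbacks follow the pattern used in the Lens Studio templates; treat the exact names as assumptions):

```js
// @input Component.ObjectTracking objectTracking // e.g. configured for Hand tracking
// @input SceneObject attachedContent             // content on an Attachment Point (assumed input)

// Only show the attached content while the target object is in view.
script.attachedContent.enabled = false;

script.objectTracking.onObjectFound = function () {
    script.attachedContent.enabled = true;
};
script.objectTracking.onObjectLost = function () {
    script.attachedContent.enabled = false;
};
```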
Object Tracking 3D
Object Tracking 3D is similar to Object Tracking, except that the position provided is in the 3D scene, rather than 2D screen space. In addition to tracking objects to the real world, you can also use the data provided by this tracker to create custom interactivity.
Attach to Camera
You can attach an object to the camera. This is great for simulating a first-person object (for example, a magic wand held in the user's hand). To do this, in the Objects panel, drag the object you want attached to the camera so that it becomes a child of the Camera object.
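The same parenting can also be done at runtime from a script; a minimal sketch, assuming the wand and the camera's scene object are wired in as script inputs (`setParent` and `setParentPreserveWorldTransform` are part of the Lens Studio SceneObject API):

```js
// @input SceneObject wand         // object to attach (assumed input)
// @input SceneObject cameraObject // scene object holding the Camera component

// Re-parent the wand under the camera so it follows every camera movement.
script.wand.setParent(script.cameraObject);

// Alternatively, keep the wand at its current world position while re-parenting:
// script.wand.setParentPreserveWorldTransform(script.cameraObject);
```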