Version: 5.x
Supported on: Snapchat, Spectacles, Camera Kit

This feature may have limited compatibility and may not perform optimally.

Environment Matching

Lens Studio comes with several techniques for adding realism to your AR objects.

Take a look at the ML Environment Matching Template for an example of the techniques covered here.

Dynamic Environment Map

The Dynamic Environment Map generates an environment map in real time from an input texture, usually the Device Camera Input. This allows your object to receive lighting from the real world. When available, the feature automatically leverages a machine learning model to improve the generated environment map. When not available, the system automatically falls back to using the input texture directly.
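The selection logic described above amounts to a simple fallback, sketched below in plain JavaScript. The function and flag names are illustrative only, not the actual Lens Studio API:

```javascript
// Illustrative sketch of the fallback behavior: prefer the ML-enhanced
// environment map when the model is available, otherwise light the
// scene with the raw input texture. Names are hypothetical.
function buildEnvironmentMap(inputTexture, mlModelAvailable) {
  if (mlModelAvailable) {
    // The ML model refines the raw camera frame into a fuller envmap
    return { source: "ml", texture: "enhanced:" + inputTexture };
  }
  // Fallback: use the input texture as-is
  return { source: "input", texture: inputTexture };
}
```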

[GIF comparison: with Dynamic Envmap vs. without Dynamic Envmap]

The ML-generated environment map is only available when using the front camera.

Add Dynamic Environment Map

In the Scene Hierarchy panel, select your Light object. In the drop-down, select Environment Map. In the Input field, select your Device Camera Texture.

[GIF: setting up the Dynamic Environment Map in the Scene Hierarchy panel]

Now, all your PBR materials will use the generated environment map to light themselves!
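If you prefer to do this wiring from a script, the setup can be sketched as below. This is a hypothetical illustration: the `lightType` and `inputTexture` property names are stand-ins for the steps performed in the Inspector, not the documented Lens Studio LightSource API.

```javascript
// Hypothetical sketch: mirror the Inspector steps in code by pointing
// a light at the Device Camera Texture. Property names are illustrative.
function attachDynamicEnvmap(light, deviceCameraTexture) {
  light.lightType = "EnvironmentMap"; // mirrors the drop-down choice
  light.inputTexture = deviceCameraTexture; // mirrors the Input field
  return light;
}
```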

Try using the Webcam option in the Preview panel and shining a light on yourself to see the effect in action!

The ML-generated environment map may be too blurred to give your object a mirror-like reflection. However, you can modulate the result with a simple reflection based on a texture. Take a look at the ML Environment Matching template to learn more about this technique.
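One way to picture that modulation: linearly blend the blurred envmap contribution with a sharp reflection sampled from a texture. The sketch below shows only the per-channel math; in the template this kind of blend lives inside the material, not in a script.

```javascript
// Per-channel blend of the blurred envmap lighting with a sharp
// texture-based reflection. amount = 0 keeps the envmap only,
// amount = 1 keeps the reflection only. Illustrative math, not the
// template's actual shader code.
function blendReflection(envSample, reflectionSample, amount) {
  return envSample * (1 - amount) + reflectionSample * amount;
}
```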
