Hand Gestures
This guide shows you how to use the hand gesture recognition feature. The hand gesture feature uses the Object Tracking component to recognize gestures and emit events, which you can use to trigger any kind of action.
Take a look at the Hand Gestures template to see an example of what you can do with hand gestures.
Overview
First, let’s take a brief look at what we are going to make. The core component of this tutorial is the Object Tracking component. It is used to recognize hand gestures and emit events when a hand gesture is detected.
Object Tracking Component
Now we are ready to create the Object Tracking component. To create it, press the “+” button in the Scene Hierarchy panel and select Object Tracking > Hand Tracking. This will create a set of nested objects. Select Orthographic Camera > Hand Tracking > Center in the Scene Hierarchy panel, and you can see the Object Tracking component on it in the Inspector panel.
Keep in mind that an object with an attached Object Tracking component is disabled whenever the component is not detecting the object it is looking for. So if you attach other scripts to the same scene object, they will not execute while tracking is lost.
Since this uses the Object Tracking component, you have access to the same APIs as well, such as onObjectFound and onObjectLost.
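For example, here is a minimal sketch of reacting to the hand being found or lost. It assumes a Script component with an Object Tracking input named tracker, the same setup used later in this guide:

//@input Component.ObjectTracking tracker

// Runs when the tracked hand is first detected.
script.tracker.onObjectFound = function () {
    print('Hand found');
};

// Runs when the tracked hand is no longer detected.
script.tracker.onObjectLost = function () {
    print('Hand lost');
};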
Gesture Tracking
The next step is adding a script that can emit events when the Object Tracking component detects a hand gesture. To do this, create an empty Scene Object and name it Gesture Tracker (Scene Hierarchy panel > + Button > Empty Object). Then, to add a script to it, press Add Component > Script in the Inspector panel, followed by the Add Script button. In the pop-up panel, press + > Script. This will create a new Script resource for you to edit.
Accessing the Object Tracker
Double-click the new script in the Asset Browser panel to open it in the Script Editor. Next, we will add an input which will give us access to the Object Tracking component.
//@input Component.ObjectTracking tracker
Save the script and assign the newly created Object Tracking component to the Tracker input field.
Take a look at the Scripting Overview for more information on working with scripting inputs.
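If you want to guard against the input being left empty, a small illustrative check like this can help:

// Warn early if the Tracker input was not assigned in the Inspector.
if (!script.tracker) {
    print('Warning: assign the Object Tracking component to the Tracker input');
}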
Listening to gesture events
Using this tracker object, we can register a listener that waits for a given hand gesture to be detected and calls a function once the gesture is found. You can do so by calling the registerDescriptorStart method of the Object Tracking component and passing it a gesture name and a callback function to call when the gesture is found.
There are six hand gestures that we can recognize, and their code names are:
- victory
- open
- index_finger
- horns
- close
- thumb
Now, having this information, let’s make that listener. First we define the callback function, then register it for the gesture. Add these lines to the script attached to Gesture Tracker:
// Called whenever the 'open' hand gesture is detected.
function triggerResponse() {
    print('Open Hand Gesture Detected');
}

script.tracker.registerDescriptorStart('open', triggerResponse);
And that is it! Now, every time an open hand is detected, you will see a message in the Logger panel.
Take a look at the API page for more information on the registerDescriptorStart method.
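The Object Tracking component also provides a registerDescriptorEnd counterpart for reacting when a gesture is no longer detected. A minimal sketch, reusing the tracker input from above:

// Runs when the 'open' gesture stops being detected.
script.tracker.registerDescriptorEnd('open', function () {
    print('Open Hand Gesture Ended');
});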
Using Behavior Script
You can also respond to Hand Gestures using the Behavior helper script.
Create the Behavior script by clicking the “+” button in the Scene Hierarchy panel, followed by Helper Scripts > Behavior. Next, change the Trigger parameter of the Behavior script to Object Tracking, then set the Event Type dropdown to Descriptor Start. As before, you can fill it in with one of the keywords listed above.
Next, we will set the Response Type for the Behavior script to respond to our gesture. For example, we can enable or disable an object when a gesture is detected by using the Set Enabled response. Take a look at the Behavior guide to learn more about what you can do with Behavior.
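If you prefer to stay in code, here is a hedged, script-only sketch of the same Set Enabled idea; the tracker and target input names are illustrative:

//@input Component.ObjectTracking tracker
//@input SceneObject target

// Show the target object while the 'open' gesture is held...
script.tracker.registerDescriptorStart('open', function () {
    script.target.enabled = true;
});

// ...and hide it again when the gesture ends.
script.tracker.registerDescriptorEnd('open', function () {
    script.target.enabled = false;
});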