
Tracking States in Unity
Level: Intermediate

Introduction

VisionLib tracking comes with different tracking states:

  • "tracked": The tracking target is successfully recognized in the camera image.
  • "critical": The tracking target was tracked, but something is disturbing the tracking (e.g. motion blur or occlusion). If the tracking stays critical for too long, the state will change to "lost".
  • "lost": The tracking target is not recognized in the camera image (anymore).

In many applications, users want to hide and show different content depending on the current state of VisionLib's tracking. To achieve this behaviour for Renderers, we provide a dedicated RenderedObject component (for more details on this component, consult the Using Different Augmentation and Init Pose guide).

If you want to add a custom behaviour on tracking state changes for your application, you can choose between two options:

  • Invoke a simple function via the public inspector interface of the TrackingAnchor or
  • Listen to the appropriate events from the TrackingManager using your own scripts.

Using Unity Events in the Inspector

The TrackingAnchor invokes Unity events whenever the state of its tracking target changes. These events can be accessed in the Tracking Events section of the TrackingAnchor in the Inspector or via script.
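If you prefer wiring these up in code rather than in the Inspector, you can add listeners to the same UnityEvents from a script. The sketch below assumes the TrackingAnchor exposes its tracking events as UnityEvent fields; the event names used here (OnTracked, OnTrackingLost) are placeholders, so check the Tracking Events section of the TrackingAnchor in your VisionLib version for the actual names.

```
using UnityEngine;
using Visometry.VisionLib.SDK.Core; // assumed namespace; adjust to your SDK version

public class TrackingStateListener : MonoBehaviour
{
    // Assign the TrackingAnchor in the Inspector.
    [SerializeField]
    private TrackingAnchor trackingAnchor;

    void OnEnable()
    {
        // Event names are assumptions; use the names shown in the
        // "Tracking Events" section of your TrackingAnchor.
        trackingAnchor.OnTracked.AddListener(HandleTracked);
        trackingAnchor.OnTrackingLost.AddListener(HandleLost);
    }

    void OnDisable()
    {
        trackingAnchor.OnTracked.RemoveListener(HandleTracked);
        trackingAnchor.OnTrackingLost.RemoveListener(HandleLost);
    }

    private void HandleTracked()
    {
        Debug.Log("Target is tracked");
    }

    private void HandleLost()
    {
        Debug.Log("Target is lost");
    }
}
```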

(Image: TrackingAnchorTrackingEvents.png, showing the Tracking Events section of the TrackingAnchor in the Inspector)

Listening to the Tracking States Event in Script

If you need more specific behaviour, you can also access the tracking states in your own script by listening to VisionLib's tracking states event. This event is accessible via the TrackingManager and provides a TrackingState. The TrackingState contains an objects array with an Anchor for every TrackingAnchor in the scene (see the TrackingState Class Reference). An Anchor references the corresponding TrackingAnchor by name; its state string contains the current state of that TrackingAnchor.

To write your own tracking state handling, subscribe to the OnTrackingStates event of the TrackingManager. Once you are listening to tracking state changes, you can add your custom implementation inside the handler function. In the following example, DoSomething is called when all TrackingAnchors are in the same tracking state.

...
void OnEnable()
{
    // Register a custom function to the tracking states event
    TrackingManager.OnTrackingStates += HandleTrackingStates;
}

void OnDisable()
{
    // Unregister from the event
    TrackingManager.OnTrackingStates -= HandleTrackingStates;
}

private void HandleTrackingStates(TrackingState states)
{
    if (states.objects.Length == 0)
    {
        return;
    }
    var firstState = states.objects[0].state;
    // Requires "using System.Linq;" at the top of the file
    bool allStatesEqual = states.objects.All(anchor => anchor.state == firstState);
    if (allStatesEqual)
    {
        DoSomething(firstState);
    }
}
...

Tracking States with External SLAM (ARKit/ARCore)

You should typically listen to the "tracked" state to generate a stable superimposition. However, when using external SLAM poses with VisionLib, we recommend listening to "critical" instead. This is because you may want to keep tracking and showing your content even though the (original) model tracking target is no longer visible in the video image. For example: you have a physical tracking target that you want to track and initialize with VisionLib's model tracking, but you also want to look around and move sideways in your virtual scene with the help of SLAM tracking. Combining VisionLib and ARKit/ARCore makes this possible, and the "critical" state lets you keep your superimposition visible until both VisionLib and SLAM tracking are truly "lost".
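A minimal way to apply this recommendation is to treat both "tracked" and "critical" as states in which the augmentation stays visible. The sketch below assumes a subscription to TrackingManager.OnTrackingStates as described in the previous section; the anchor name "MyAnchor" and the augmentation GameObject field are hypothetical placeholders.

```
// Sketch: with external SLAM, keep the augmentation visible while the anchor
// is "tracked" or "critical", and hide it only once the state is "lost".
// "augmentation" is assumed to be a GameObject assigned in the Inspector.
private void HandleTrackingStates(TrackingState states)
{
    foreach (var anchor in states.objects)
    {
        if (anchor.name != "MyAnchor") // hypothetical TrackingAnchor name
        {
            continue;
        }
        bool visible = anchor.state == "tracked" || anchor.state == "critical";
        augmentation.SetActive(visible);
    }
}
```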

Distinguish the Critical Tracking State

The TrackingState contains a state and a quality value. If you want to distinguish the "critical" tracking state further, you need to check the combination of these two values, as shown below:

  • "lost", quality > 0: The object has not been found and the tracker is searching for a pose.
  • "tracked", quality >= minTrackingQuality: The tracker is successfully tracking the object.
  • "critical", quality > 0: The tracker is tracking the object using only the SLAM pose; the object is assumed to be in the field of view.
  • "critical", quality 0: The tracker is tracking the object using only the SLAM pose; the object is assumed NOT to be in the field of view.
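The combinations above could be checked in a handler like the following sketch. The per-anchor field names state and quality follow the TrackingState description above, but the exact type of the array elements and the minTrackingQuality threshold are assumptions; consult the TrackingState Class Reference for the actual member names.

```
// Sketch: distinguish the two "critical" sub-states described above.
// "anchor" is one element of TrackingState.objects; minTrackingQuality is
// assumed to be a threshold you configure yourself.
private void ClassifyAnchorState(var anchor, float minTrackingQuality)
{
    if (anchor.state == "critical")
    {
        if (anchor.quality > 0)
        {
            // SLAM-only tracking; object assumed to be in the field of view.
        }
        else
        {
            // SLAM-only tracking; object assumed NOT to be in the field of view.
        }
    }
    else if (anchor.state == "tracked" && anchor.quality >= minTrackingQuality)
    {
        // The tracker is successfully tracking the object.
    }
    else if (anchor.state == "lost")
    {
        // The object has not been found; the tracker is searching for a pose.
    }
}
```

Note that `var` in the parameter list is a placeholder for the actual anchor type from the TrackingState class; replace it with the concrete type from your SDK version.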

Find more details on VisionLib with External SLAM here: External SLAM (ARKit/ARCore) Support.