When using Model Tracking, showing the line model or giving other visual guidance is good practice to help users get started with your AR app: it helps them align the camera with the init pose so that tracking can start.
To do so, you can create custom graphics, set e.g. the "showLineModel": "true" parameter in the tracking configuration, or display the superimposition visuals right at startup.
The latter in particular is often not desired: usually you want to show your augmentation visuals only once the target is successfully tracked. The same is true for visual guidance: a rendered line model is helpful for orientation, but only while tracking has not started yet or has been lost.
For visual guidance, there is a parameter you can set in your tracking configuration, along with further parametrization such as the line model color:
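As a minimal sketch, the relevant part of a tracking configuration could look like the snippet below. Only showLineModel is taken from this guide; the surrounding keys, the example model URI and the lineModelColor parameter name are illustrative assumptions, so check the VisionLib configuration reference for the exact schema:

```json
{
    "tracker": {
        "type": "modelTracker",
        "parameters": {
            "modelURI": "project-dir:Models/myModel.obj",
            "showLineModel": true,
            "lineModelColor": [0, 255, 0]
        }
    }
}
```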
If you want to add custom behaviour on tracking state changes in your application, you can choose between two options: using the TrackingStateProvider component, or listening to the TrackingManager events from your own scripts.
VisionLib comes with the TrackingStateProvider component (you can see it in use within the PosterTracking example scene of the VisionLib.SDK.Examples-Unity.unitypackage). You can simply drag it onto a GameObject to react to tracking state changes without the need to write a custom script.
Now you can see a list of empty events and add your desired behaviour as a response to these events.
For example, this could be activating a GameObject as soon as it is tracked and deactivating it when tracking is lost:
1. Click + under the Tracked() event of the TrackingStateProvider.
2. Drag the GameObject you want to toggle into the None (Object) field.
3. Change No Function to GameObject / SetActive(bool) and check the box.
4. Repeat these steps for the Tracking Lost() event, but uncheck the box.

Now the component should look like this:
You can also invoke a custom function from your script that way, as long as it's public and takes no parameters or only basic ones like string, bool or int. Just drag the GameObject that has your script attached into the event field and choose your desired function.
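A custom function bound this way could be as simple as the following sketch; the component and method names here are made up for illustration:

```csharp
using UnityEngine;

// Illustrative component: attach it to a GameObject, drag that GameObject
// into the event field of the TrackingStateProvider and pick
// TrackingEventReceiver/ShowTrackedMessage from the function dropdown.
public class TrackingEventReceiver : MonoBehaviour
{
    // Public and limited to a basic parameter type (string), so it is
    // selectable in the UnityEvent dropdown.
    public void ShowTrackedMessage(string targetName)
    {
        Debug.Log("Started tracking: " + targetName);
    }
}
```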
The TrackingStateProvider component gives you the opportunity to react to the following events:
Event | Description |
---|---|
Tracked() | Event fired once after the tracking state changed to "tracked". |
Tracking Critical() | Event fired once after the tracking state changed to "critical". |
Tracking Lost() | Event fired once after the tracking state changed to "lost". |
If you need to implement a more specific behaviour, you can also access the tracking states within your own script by listening to VisionLib's tracking states event.
First, we need to register to the OnTrackingStates event of the TrackingManager (which provides the TrackingState; the state is stored per TrackingState.TrackingObject, see the TrackingState Class Reference):
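A minimal sketch of such a registration is shown below. The event and type names (TrackingManager.OnTrackingStates, TrackingState) are taken from this guide, while the namespaces and the exact handler signature are assumptions; verify them against the TrackingState Class Reference:

```csharp
using UnityEngine;
// Namespaces are assumptions for this sketch; adjust them to your SDK version.
using Visometry.VisionLib.SDK.Core;
using Visometry.VisionLib.SDK.Core.API;

public class TrackingStateListener : MonoBehaviour
{
    private void OnEnable()
    {
        // Register to VisionLib's tracking states event.
        TrackingManager.OnTrackingStates += HandleTrackingStates;
    }

    private void OnDisable()
    {
        // Unregister again to avoid callbacks on a disabled or destroyed object.
        TrackingManager.OnTrackingStates -= HandleTrackingStates;
    }

    private void HandleTrackingStates(TrackingState state)
    {
        // React to tracking state changes here (see the example below).
    }
}
```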
Once you are successfully listening to tracking state changes, you can add your custom implementation within the newly defined HandleTrackingStates function. There are several states available, which are listed in the table below.
As an example, we want to toggle the augmentation's visibility and hide it whenever tracking is lost:
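A hedged sketch of such a handler could look like this; it assumes the TrackingState exposes an objects collection of TrackingState.TrackingObject entries, each carrying a state string:

```csharp
// Reference to the augmentation GameObject, assigned in the Inspector.
[SerializeField]
private GameObject augmentation;

private void HandleTrackingStates(TrackingState state)
{
    // Field names (objects, state) are assumptions; check the
    // TrackingState Class Reference of your SDK version.
    foreach (TrackingState.TrackingObject trackingObject in state.objects)
    {
        if (trackingObject.state == "lost")
        {
            // Hide the augmentation whenever tracking is lost ...
            augmentation.SetActive(false);
        }
        else if (trackingObject.state == "tracked")
        {
            // ... and show it again as soon as the target is tracked.
            augmentation.SetActive(true);
        }
    }
}
```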
At the moment there are three states to listen to, which are:
State | Description |
---|---|
"tracked" | Tracking target is tracked successfully. |
"critical" | Tracking target was tracked, but something disturbs tracking (e.g. motion blur or occlusion), making tracking unstable. If the tracking stays critical for too long, then the state might change to "lost". |
"lost" | Tracking target was lost and could not be tracked anymore. |
You should typically listen to the "tracked" state to generate a stable superimposition. However, we recommend also listening to "critical" when using external SLAM poses with VisionLib. This is because you may want to continue tracking and showing your content even though the (original) model tracking target is no longer visible in the video image. For example: you have a physical tracking target that you want to track and initialize with VisionLib's Model Tracking, but you also want to look around and move sideways in your virtual scene with the help of SLAM tracking. Combining VisionLib with ARKit/ARCore makes this possible, and the "critical" state enables you to keep your superimposition visible until both VisionLib and SLAM tracking are truly "lost".
The TrackingState contains a state and a quality value. If you want to distinguish the "critical" tracking state further, you need to check the combination of those two values, as seen below:
Tracking State | Tracking Quality | Description |
---|---|---|
lost | > 0 | The object has not been found and the tracker is searching for a pose. |
tracked | > trackingInlierRatio | The tracker is successfully tracking the object. |
critical | > 0 | The tracker is tracking the object using only the SLAM pose; the object is assumed to be in the field of view. |
critical | 0 | The tracker is tracking the object using only the SLAM pose; the object is assumed NOT to be in the field of view. |
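As a sketch, distinguishing the two "critical" rows inside the handler could look like this (again assuming objects, state and quality as the field names):

```csharp
private void HandleTrackingStates(TrackingState state)
{
    foreach (TrackingState.TrackingObject trackingObject in state.objects)
    {
        if (trackingObject.state != "critical")
        {
            continue;
        }

        if (trackingObject.quality > 0)
        {
            // Only the SLAM pose is used, but the object is assumed
            // to still be in the field of view.
        }
        else
        {
            // Only the SLAM pose is used and the object is assumed
            // NOT to be in the field of view anymore.
        }
    }
}
```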
Find more details on VisionLib with External SLAM here: External SLAM (ARKit/ARCore) Support.