
Poster Tracker

Tracking configuration parameters for setting up a poster tracker.

The "posterTracker" uses feature-based tracking with a reference image for determining the camera pose.

On HoloLens, the poster tracker works in meters, so the realWidth of the poster is interpreted in meters. To stay consistent with your units, you may set the metric parameter to "m".
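
For example, a poster that is 26.9 cm wide, used in a scene authored in meters, could be configured with a parameter fragment like the following (the values are purely illustrative; both parameters are described in the table below):

"parameters": {
    "realWidth": 0.269, // poster width, here in meters
    "metric": "m"       // scene and poster units are meters
}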

Configuration File Parameters

The following parameters can be set inside the tracking configuration file:

Parameter Type Default value Example
imageURI string mandatory to set "imageURI": "project_dir:myimage.png"
"imageURI": "http://myserver.com/myimage.png"

This string defines a URI to the reference image. PNG and JPEG images are supported. If the image file is in the same directory as the configuration file, it can be referenced as, for instance, "imageURI": "project_dir:myimage.png".

The usage of http URIs is currently not possible on UWP, which includes HoloLens. If you require this feature, please contact us.

downSample bool false "downSample": true

If this flag is set, tracking of the reference image is performed at half the image resolution. This reduces the computational effort and speeds up processing, but the results might be less precise.

metric string mm "metric": "m"

Only interpreted on iOS with extendibleTracking set to true. Set this to the unit in which your poster dimensions are given. Valid values are metric scales ("mm", "cm", "dm", "m" or "km") and imperial scales ("in", "ft", "yd", "ch", "fur", "ml").

realWidth float 1 "realWidth": 0.269

This value defines the real-world width of the reference image in the units of the virtual scene. If you want to track a printed A4 image, for example, we recommend specifying the value in meters (an A4 sheet is 0.21 m wide in portrait orientation) in order to stay compatible with the HoloLens poster tracker.

transform json struct identity transformation "transform": { "t": [ 0.0, 0.0, 0.0], "r": [ 1.5707963, 0.0, 0.0 ] }

If this transformation is not set, the origin is located in the center of the reference image. The reference image lies in the xy-plane, with z pointing away from the camera. If this is not desired, this parameter can be used to transform the coordinate system of the reference image. The vector t is the translation in mm and the vector r is an axis-angle representation of the rotation. The example above represents a rotation of 90 degrees around the x-axis, so that the reference image lies in the xz-plane, with y pointing upwards.
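
As a further illustration, the following transform (values purely illustrative) shifts the origin by 100 mm along the x-axis and rotates the reference image by 180 degrees around the z-axis; the rotation vector points along the rotation axis and its length is the angle in radians:

"transform": {
    "t": [ 100.0, 0.0, 0.0 ],    // translation of 100 mm along x
    "r": [ 0.0, 0.0, 3.1415927 ] // 180 deg (pi radians) around z
}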

maxFramesFeaturePrediction int 15 "maxFramesFeaturePrediction": 20

If the reference poster cannot be tracked reliably, the camera pose is predicted without strong validation. This value defines the number of frames for this prediction step. Once this number of frames is reached, re-initialization is executed. The prediction is only used for non-extendible tracking.

extendibleTracking bool false "extendibleTracking": true

Turns extendible tracking on or off. If extendible tracking is turned on, features in the surroundings of the reference image are additionally reconstructed and tracked. This makes it possible to track the camera even when the reference image is no longer visible. The user needs to perform a SLAM dance, i.e. translate and rotate the camera so that there is enough baseline for the feature reconstruction.

minCornerness int 25 "minCornerness": 25

Minimum cornerness for the feature detector of the SLAM module. This parameter is only used when extendibleTracking is set to true.

minFeatureDistance int 10 "minFeatureDistance": 18

Minimum distance between neighboring features for the feature detector of the SLAM module. This parameter controls the number of features used. It is only used when extendibleTracking is set to true.

minTriangulationAngle float 5.0 "minTriangulationAngle": 4

Minimum angle in degrees for the triangulation of feature points during feature reconstruction. This parameter is only used when extendibleTracking is set to true.
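
Putting the SLAM-related parameters together, an extendible tracking setup might use a fragment like the following (the values mirror the examples above and are only a starting point):

"parameters": {
    "extendibleTracking": true,
    "minCornerness": 25,        // feature detector threshold
    "minFeatureDistance": 18,   // controls the number of used features
    "minTriangulationAngle": 4  // in degrees, for feature reconstruction
}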

debugLevel int 0 "debugLevel": 2

This value specifies the amount of visual output for debugging purposes. Debug level 0 generates no debug information at all; this mode is faster and should be used for a final release. Debug level 1 produces images visualizing various debug information. Debug level 2 is a mode for internal debugging purposes; only use it if you know what you want to do with it. Enabling this feature can significantly harm the performance of the tracking pipeline.

synchronous bool false "synchronous": true

This parameter exists ONLY FOR TESTING PURPOSES. Don't set it to true unless you really know what you are doing! Usually the tracking utilizes multiple threads. If this parameter is set to true, the whole tracking process runs in only one thread. This reduces the performance, but in combination with the synchronous worker interface it allows you to get deterministic tracking results. This is useful if you want to get the same tracking results for a sequence of images, independent of the current processor and GPU utilization.
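
For reproducible offline tests, e.g. on a recorded image sequence, synchronous can be combined with disabled debug output as in the following sketch (for production use, leave synchronous at its default of false):

"parameters": {
    "synchronous": true, // single thread, deterministic results (testing only)
    "debugLevel": 0      // no debug output, fastest mode
}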

Example configuration file

{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "meta": {
        "name": "PosterTrackerLeaves",
        "description": "Tracker for a reference image",
        "author": "VisionLib"
    },
    "tracker": {
        "type": "posterTracker",
        "version": 1,
        "parameters": {
            "imageURI": "project_dir:leaves.png",
            "downSample": false,
            "realWidth": 269.0, // width in mm
            "transform": {
                "t": [ 0.0, 0.0, 0.0 ],
                "r": [ 1.5707963, 0.0, 0.0 ] // rotation of 90 deg around x --> ground plane in xz
            },
            "maxFramesFeaturePrediction": 20, // only used for non-extendible tracking
            "extendibleTracking": true,
            "minCornerness": 15, // for feature detector
            "minFeatureDistance": 18, // for feature detector
            "minTriangulationAngle": 4, // for feature reconstruction
            "debugLevel": 1
        }
    },
    "input": {
        "useImageSource": "cameraWindowsLifeCam",
        "imageSources": [{
            "name": "video0",
            "type": "video",
            "data": {
                "uri": "project_dir:Videos/video0.avi",
                "scale": 0.5
            }
        }, {
            "name": "imageSequence0",
            "type": "imageSequence",
            "data": {
                "uri": "project_dir:seq/*.png",
                "device": "WindowsLifeCam"
            }
        }, {
            "name": "camera0",
            "type": "camera",
            "data": {
                "unit": 0,
                "undistort": false
            }
        }, {
            "name": "cameraWindowsLifeCam",
            "type": "camera",
            "data": {
                "unit": 0,
                "device": "WindowsLifeCam"
            }
        }]
    }
}