UnitySDK Quick Start

Level: Basic

Setting up a quick AR Demo using the 3D Model Tracker in Unity

In this quick guide, we demonstrate how to get started with the VisionLib SDK for Unity. We give a broad overview of the Unity SDK, the workflow with VisionLib, and the structure of the tracking configuration file.


Before you start, be sure to have the following installed or at hand:

  • Unity 2019.4 LTS or higher (we recommend using 2019.4 LTS)
  • a webcam attached for testing the application on your desktop PC
  • the latest VisionLib.SDK-Unity package, downloaded from the customer area
  • a valid license file, downloaded from the customer area
  • our paper crafted mini car 3D model resource, which you can download here. Print the PDF and follow the instructions to create the paper-crafted test target.

Unity Example Project

  • If you want to use an external webcam, be sure to connect it before you start Unity
  • Start Unity and create a new project
  • Import the VisionLib.SDK-Unity.tgz via the package manager and add the VisionLib.SDK.Examples-Unity.unitypackage via Assets > Import Package > Custom Package
  • Browse to VisionLib Examples/ModelTracking/Simple in the project navigator and open SimpleModelTracking.unity

License File

  • Copy your license file into the StreamingAssets/VisionLib folder of your Unity project

Select the VLTracking GameObject in the hierarchy and drag the license file from the project view into the public License File field of the TrackingConfiguration component in the inspector.

Run the Application

  • Start the application in the editor by pressing the play button
  • Start the tracking engine by pressing the Start button.

The example scene lets you select your camera at start.

  • Select the desired camera and proceed

You should see the live camera view and a 3D model of the mini car in Unity's game window now.

  • Next, point the camera at your physical 3D model to match it with the rendered one.
  • Once recognized, the 3D model should snap to the physical one in the live video - and be tracked.

If everything worked well, your game view should look like the picture above.

For demonstration purposes, we merely superimpose the same 3D model that is used for tracking the real mini car in the live image stream. Of course, you can replace this simple augmentation with anything more complex, using the full range of features that Unity offers. In contrast to other tracking techniques, model tracking always uses the physical 3D model as a reference. As an advantage, all augmentations appear right in place as soon as the real object is recognized and tracked in the images, because the coordinate systems of the tracking system and the created virtual content are aligned automatically.

Controlling and customizing the tracking is done via tracking configurations. Within these JSON-formatted files, you can change tracking targets, fine-tune the tracking parameters, or specify camera or image stream inputs. A pre-configured file exists for this quick-start demo, which you can use as a starting point and adapt to your needs.
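As a sketch, a minimal tracking configuration for a model tracker might look like this (the section and parameter names follow the demo file discussed later in this guide; the model file name is illustrative):

```json
{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "tracker": {
        "type": "modelTracker",
        "version": 1,
        "parameters": {
            "modelURI": "project-dir:MyModel.obj",
            "metric": "m"
        }
    }
}
```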

Changing the Model

To change the 3D model, do the following:

  • Stop the current demo by leaving the play mode.
  • Switch to your project files and open the folder in a file manager, like Finder on Mac or Explorer on Windows.

In order to track another model, you need:

  • A digital 3D model of the physical object you want to track in some readable format (e.g. OBJ; all supported formats are listed here: Model Tracker).
  • The real physical object that corresponds to the 3D model for live tracking.

In our case, we replace the mini car with a sports car model. To do so:

  • Copy the new digital 3D model into the Models folder of your project; we will use this for superimposition.
  • Then, copy the same 3D model additionally into a new Tutorial folder inside the StreamingAssets/VisionLib folder; this copy is used for tracking.
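The copy steps above can be sketched from the command line. This is an illustrative sketch only: SportsCar.obj stands in for your own model file, and the paths assume Unity's default layout, where the StreamingAssets folder lives under Assets.

```shell
# Run from the Unity project root. Create the two destination folders:
mkdir -p Assets/Models
mkdir -p Assets/StreamingAssets/VisionLib/Tutorial

# Stand-in for your real model file; replace with an actual copy of SportsCar.obj.
touch Assets/Models/SportsCar.obj

# One copy for superimposition (Models), one for tracking (StreamingAssets):
cp Assets/Models/SportsCar.obj Assets/StreamingAssets/VisionLib/Tutorial/
```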

From the StreamingAssets/VisionLib/Examples/ModelTracking folder, copy the SimpleModelTrackingConfiguration.vl file, rename the copy (e.g. to MyModelTrackingConfiguration.vl), and move it to the Tutorial folder you created above. This is the tracking configuration file, which is in JSON format. Open it in a text editor to review it:

"type": "VisionLibTrackerConfig",
"version": 1,
"meta": {
"name": "Simple VisionLib Car Tracker",
"description": "Simple model tracking configuration file",
"author": "Visometry"
"tracker": {
"type": "modelTracker",
"version": 1,
"parameters": {
"modelURI": "project-dir:VLMiniCar.obj",
"useColor": true,
"metric": "m",
"showLineModel": true,
"keyFrameDistance": 5,
"laplaceThreshold": 1,
"normalThreshold": 0.3
  • In the tracker section, look for modelURI and change VLMiniCar.obj (the mini car's 3D model) to SportsCar.obj (the new tracking model we copied into the folder).

Please note the metric parameter. It defines the scale in which your 3D model was created. In our case, both the mini car model and the new sports car are modeled in meters (m). Change this to mm or cm if your model uses a different unit.
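After both edits, the parameters block of your copied configuration would look like this (assuming the sports car model is also modeled in meters):

```json
"parameters": {
    "modelURI": "project-dir:SportsCar.obj",
    "useColor": true,
    "metric": "m",
    "showLineModel": true,
    "keyFrameDistance": 5,
    "laplaceThreshold": 1,
    "normalThreshold": 0.3
}
```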

  • Switch back to Unity.
  • Select the VLTracking GameObject and change the tracking configuration reference to your new file (e.g. MyModelTrackingConfiguration.vl).

We don't want to use the old model for visualization, but the new one instead. To do so, we need to add it to the scene.

  • In the hierarchy under SceneContent, disable the VLMiniCar model.
  • In the project panel, open the Models folder.
  • Drag your new model into the scene, in our case it's the SportsCar.obj asset.
  • Now, select the VLInitCamera prefab under VLTracking in the hierarchy.

This camera defines the model's initial pose, which you saw once the Start button was pressed. Right now, the camera is set up nicely for the old mini car model. We need to change this pose to fit the new model. With the camera selected, move it until you get a good view of the new model (have a look at the camera preview window in the image above). This position defines where your tracking, and later your users, should start working with the augmented reality.

Note: Instead of setting the transform of the VLInitCamera directly, you can also adjust the scene view until you have the desired initial view of your object, select the VLInitCamera, and press the Align with view button. You might also need to adjust the camera's near/far clipping planes.

Run the Application again

  • In Unity, press play to start the scene again.
  • Next, press the Start button and select a camera. The new model should now be rendered as you've defined its init pose via the VLInitCamera.
  • Again, match the real and virtual objects by moving the camera. If everything is fine, tracking should start immediately and the virtual and physical models should match precisely:

Debug Information

To understand what VisionLib's computer vision engine is doing, take a look at the ModelTrackingSetup example, which is covered in more detail in Using the Model Tracking Setup Scene. After pressing the Debug on/off button in this example, you will see a grey-scale image with 'jagged' lines representing the ongoing edge detection of the 3D model. You can manipulate the tracking parameters from the tracking configuration, including the length of these lines, using sliders accessible via the Parameters on/off button. Live values for important tracking parameters and statistics are also displayed in the bottom left corner of the screen.


Hey, hey! You just tested VisionLib's 3D model tracking and exchanged the demo model.

If this does not work with your model, we recommend reading through the following sections in order to set the parameters right for your tracking case.

Next Steps

You might have noticed a warning stating that your camera has not been calibrated. Calibration is a crucial step that greatly improves tracking quality, so we recommend calibrating your camera next. To do so, follow this tutorial: Camera Calibration.