Simple iOS Credit Card Tracker
Level: Basic


The purpose of this application is to demonstrate the step-by-step integration of the VisionLib SDK into your iOS Objective-C application. A simple test model everybody has at hand is any credit-card-sized card. We will create an application based on Xcode's Game template and successively extend it until we have a fully functional AR application augmenting a model over your credit card.


  • Understanding the directory structure of the vlSDK
  • Setting up the vlSDK in a new project
  • Creating the main object and implementing a delegate protocol for the vlSDK
  • Adding your license
  • Displaying the camera image
  • Integrating a tracking configuration into your application
  • Performing some error checking after initialization
  • Displaying the tracking quality and other parameters in real time


  • A Mac for development with Xcode installed.
  • For iOS development,
    • a 64-bit iOS device
    • an Apple developer account
  • The downloaded vlSDK
  • A valid license file


Make sure you have downloaded the latest version of the vlSDK for Apple. You can find the following file structure inside the provided ZIP file.

  • vlSDK
    • iOS
      • Frameworks
        • vlSDK.framework - The VisionLib SDK framework for iOS
      • Examples
        • vlSDK_simpleSceneKitExample - A simple example for iOS
    • MacOS
      • Frameworks
        • vlSDK.framework - The VisionLib SDK framework for macOS
        • embedded
          • vlSDK.framework - The embedded version of the framework
      • Examples
        • vlSDK_simpleSceneKitExample - A simple example for macOS
    • tracker - A global folder containing some example trackers
    • license - A convenient folder for placing the license into
    • Windows - Not in focus right now...
      • ...


You can also skip the following sections and just open the example project in Xcode. Note, however, that you might then miss the essential explanations in between.

If you want to start right away, please copy your license.xml into the license folder. You will always need a valid license file in order to start the SDK.

Create a Project

Start Xcode and create a new project of the Game type using iOS.


After that, give the project a name. In this example we named it simpleCreditCardTracker.


The project overview will open up. In order to test your simple application, you need to sign the application with your Apple developer team.


Add the vlSDK Framework

Now drop the vlSDK.framework file into your project and add the other required frameworks using the + button in the project settings.


If you have NO C++ code in your application right now, make sure to add libc++.tbd as a linker dependency as well. Additionally, you need to set the path where the linker should later search for the framework.


You can add it using the following path: "$(SRCROOT)/../../Frameworks". In this case the Xcode project lies at MacOS/Examples/simpleCreditCardTracker

You should also change the following settings in the same panel:

  • Enable Bitcode -> FALSE
  • Valid Architectures -> Only ARM64 (remove the others)

Now you should already be able to compile and start your application, which will present you with a rotating spaceship. In the next steps, we will add some AR and connect this ship to your credit card.

Add the Tracking Configuration

There are three example tracking configurations in the tracker folder of the SDK:

  • A simple credit-card-sized object
  • The VisionLib example car, as shown in the Unity Tutorial (UnitySDK Quick Start)
  • An A4-sized paper sheet

You can also create your own tracking configuration using our tracking configurator, or just by using a simple text editor.
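For illustration, a minimal model tracker configuration in a .vl file might look like the following JSON sketch. This is an assumption for illustration only: the exact schema, type names, and parameters may differ between SDK versions, so consult the configurations shipped in the tracker folder as the authoritative reference.

```json
{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "tracker": {
        "type": "modelTracker",
        "version": 1,
        "parameters": {
            "modelURI": "project_dir:creditcard.obj",
            "metric": "mm"
        }
    }
}
```

The modelURI points at the 3D model whose edges are matched against the camera image, and the metric tells the tracker which unit the model is authored in.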

For our example, just add the folder creditCardTracker to the project.

Add the License

You also need to add the license folder for the vlSDK to work.

The file structure of your project in Xcode should now look similar to this.


Be sure that you have checked the Target Membership flags for both folders (license and tracker), so they will be included as resources in the resulting app.


Let's add some Code

Open the main file GameViewController.h and add the following three lines:

#import <GLKit/GLKit.h>
#import <vlSDK/vlSDK.h>

@interface GameViewController : UIViewController <SCNSceneRendererDelegate, vlFrameListenerInterface>

These lines import the SDK and adopt two delegate protocols: one to receive notifications from the renderer, and the VisionLib SDK delegate interface.

Now switch to the GameViewController.m implementation and add the vlSDK variable:

@interface GameViewController () {
    // Variable for the VisionLib SDK Objective-C interface
    vlSDK *visionLibSDK;
}

At the end of the viewDidLoad function add the following snippet:

// Add VisionLib initialization code
// Get the license file to be used.
NSString *licenseFile = [[NSBundle mainBundle] pathForResource:@"license/license" ofType:@"xml"];
// Set the file URI of the configuration file
NSString *trackingConfigURI = [NSString stringWithFormat:@"file://%@",
    [[NSBundle mainBundle] pathForResource:@"creditCardTracker/creditcard" ofType:@"vl"]];
// Now init with our URI
visionLibSDK = [[vlSDK alloc] initTrackerWithURI:trackingConfigURI andLicensePath:licenseFile andDelegate:self];
// Configure the SDK to preinvert the matrices so they can be used directly with OpenGL or Metal.
[visionLibSDK configureExtrinsicCameraInverted:YES];
// Run it
[visionLibSDK run];
// Register for the render loop; we do this AFTER initializing the VisionLib
scnView.delegate = self;


We create a URI that allows the SDK to access your tracker. In this case, it is a file URI pointing to your creditCardTracker, which resides in your resources as well. NOTE: On iOS, the URI uses file:// as a prefix to your path.

We now initialize the visionLibSDK variable by passing the URI of the configuration file and the path of the license file, and by setting this controller as a delegate to receive relevant data from the VisionLib.

As we do not want to deal with matrix multiplications and inversions right now, we tell the VisionLib to pass preinverted matrices, which works nicely with the SceneKit implementation.
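To see what "preinverted" buys you: raw extrinsic data is typically the world-to-camera transform, while a scene graph wants the camera-to-world transform, i.e. its inverse. For a rigid transform this inverse has a cheap closed form. The following Python sketch is illustrative only (not VisionLib code):

```python
def invert_rigid(T):
    """Invert a 4x4 rigid transform [[R, t], [0, 1]] without a general
    matrix inverse: the inverse is [[R^T, -R^T t], [0, 1]]."""
    # Extract the 3x3 rotation R and the translation t (row-major nested lists)
    R = [[T[i][j] for j in range(3)] for i in range(3)]
    t = [T[i][3] for i in range(3)]
    # The transpose of a rotation matrix is its inverse
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    # The inverted translation is -R^T t
    t_inv = [-sum(Rt[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [[Rt[i][0], Rt[i][1], Rt[i][2], t_inv[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]
```

Since the SDK hands us the matrices already inverted, we can assign them straight to the camera node's transform without doing this work per frame.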

We also tell the VisionLib to run. This creates a thread for your application and starts the tracking. Next, we need to implement some methods of SCNSceneRendererDelegate to synchronize the display of our image with the renderer. Therefore, we add the following method to GameViewController.m:

- (void)renderer:(id<SCNSceneRenderer>)renderer updateAtTime:(NSTimeInterval)time
{
    [visionLibSDK process];
}

We want to get the image and display it in the background. We achieve this by simply implementing the delegate callback like this:

- (void)onMetalImageTexture:(id<MTLTexture>)texture withRotationMatrix:(float *)m
{
    SCNView *scnView = (SCNView *)self.view;
    // Configure the view and apply the rotation, which can change at runtime
    scnView.scene.background.contents = texture;
    scnView.scene.background.contentsTransform = SCNMatrix4FromGLKMatrix4(GLKMatrix4MakeWithArray(m));
}

As a more generic approach (which also works on macOS), you might prefer the following code. Note that it cannot coexist with the onMetalImageTexture variant.

- (void)onCGImageRef:(CGImageRef)texture withRotationMatrix:(float *)m
{
    SCNView *scnView = (SCNView *)self.view;
    // Configure the view and apply the rotation, which can change at runtime
    scnView.scene.background.contents = (id)CFBridgingRelease(texture);
    scnView.scene.background.contentsTransform = SCNMatrix4FromGLKMatrix4(GLKMatrix4MakeWithArray(m));
}

The first line obtains the current scene view, the second sets the camera texture as the scene background, and the third applies a transformation to the image. The latter is required if you support device rotation.

Before we can start the app, you need to set the NSCameraUsageDescription key in the Info settings of the application. You can set a user notification about the camera usage here.
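In the raw Info.plist source, this corresponds to an entry like the following (the description string is just an example and should explain your app's camera usage to the user):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to track your credit card for augmented reality.</string>
```

Without this key, iOS terminates the app as soon as it tries to access the camera.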


You will now be able to start the application and run it on your iOS device.

If everything worked fine, you can see the camera image along with the license watermark.

NOTE: If you are using a trial license, the number of tracking starts per deployment is limited to five. After that, you have to redeploy and restart the application.

Implement the Tracking

Before adding some listeners to reflect the camera's position, you first need to understand the method of model tracking. In the current application state, you'll need an initial pose for your credit card, which guides the user in aligning the virtual model with the real one.

Therefore, insert a custom init pose or get the one proposed in the configuration file that we passed on startup. Add the following variables to the GameViewController interface:

@interface GameViewController () {
    // Variable for the VisionLib SDK Objective-C interface
    vlSDK *visionLibSDK;
    // Add more global variables for the scene and camera, since we want to manipulate them
    SCNNode *cameraNode;
    // Save the initial pose
    BOOL gotInitPose;
    SCNMatrix4 initPose;
}

Also request the initial pose from the SDK:

[visionLibSDK getInitPose];

Add the following delegate function to be able to receive and set the requested init pose:

- (void)onInitPoseMatrix:(float *)m
{
    // We save the first init pose and set the actual viewpoint to it
    if (!gotInitPose) {
        SCNView *scnView = (SCNView *)self.view;
        initPose = SCNMatrix4FromGLKMatrix4(GLKMatrix4MakeWithArray(m));
        scnView.allowsCameraControl = YES;
        cameraNode.transform = initPose;
        scnView.pointOfView = cameraNode;
        gotInitPose = YES;
    }
}

This saves the init pose that was first passed from the library. You still need to make some changes to the code of Apple's template:

  • Remove the local declaration of cameraNode so that the globally defined variable is used: cameraNode = [SCNNode node];
  • Load creditCardTracker/creditcard.obj instead of art.scnassets/ship.scn
  • Comment out the line that triggers the rotation: // [ship runAction:[SCNAction repeatActionForever:[SCNAction rotateByX:0 y:2 z:0 duration:1]]];
  • Instead of triggering the animation with the runAction command, enable the scene animation by setting scnView.playing = TRUE; in the viewDidLoad function.

Then, the extrinsic data (the pose) and the intrinsic data (the internal camera parameters) should be applied to the scene.

To set the model view matrix (representing the camera position of your system) and the projection matrix (representing the internal parameters of your camera system), you need to add two more delegate functions:

- (void)onIntrinsicData:(float *)data
{
    cameraNode.camera.projectionTransform = SCNMatrix4FromGLKMatrix4(GLKMatrix4MakeWithArray(data));
}

- (void)onExtrinsicData:(float *)data isValid:(bool)valid
{
    // Set the pose from the VisionLib
    if (valid) {
        cameraNode.transform = SCNMatrix4FromGLKMatrix4(GLKMatrix4MakeWithArray(data));
    } else {
        cameraNode.transform = initPose;
    }
}

In the onIntrinsicData routine, set the projection transform for the current frame.
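For background, a pinhole camera's intrinsics (focal lengths fx, fy and principal point cx, cy, in pixels) translate into an OpenGL-style projection matrix roughly as sketched below. This is a generic textbook construction, not VisionLib's internal code, and sign conventions vary between renderers:

```python
def projection_from_intrinsics(fx, fy, cx, cy, width, height, near, far):
    """Build an OpenGL-style 4x4 projection matrix (row-major nested lists)
    from pinhole intrinsics, assuming the common right-handed convention
    with the camera looking down the negative z axis."""
    return [
        [2 * fx / width, 0.0, 1 - 2 * cx / width, 0.0],
        [0.0, 2 * fy / height, 2 * cy / height - 1, 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]
```

The SDK performs an equivalent conversion for you and delivers the result as a ready-to-use float array, which is why the delegate can assign it directly to the projection transform.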

In onExtrinsicData, you get the pose of the camera as well as a boolean flag indicating the validity of the pose. If the pose is invalid, the extrinsic data usually represents the last valid pose. In that case, the previously saved initial pose is set in order to guide the user back to the beginning. Of course, you could also set the last valid pose as a new init pose, etc. Feel free to play around with the available functions.


If everything went fine, your application should work like this: aligning the virtual credit card model with the real one augments the real card with a rotating spaceship.

Congratulations, you just created your first simple iPhone app with VisionLib!