
Image Recorder

Tracking configuration parameters for setting up a simple image recorder.

It can be useful to record a sequence of images for later use with exactly the same camera parameters as used in the VisionLib SDK. Typical use cases are:

  • Calibrating your custom camera parameters
  • Creating tests
  • Getting feedback from users
  • Simulating on-site scenarios off-site, which can save a lot of money and time

Since version 19.3.1 you can save image sequences along with the intrinsic camera parameters. When recording on iOS devices, the relevant ARKit pose estimates can even be saved along with the images.

The image recorder configuration is as simple as it sounds and consists of only a few parameters. It can be used as a stand-alone configuration. The model tracker configuration also accepts the same parameters, so you can record while tracking.

Configuration File Parameters

The following parameters can be set inside the image recorder configuration file:

recordOnStartup (bool, optional, default: true; false when used in the model tracker configuration)
Example: "recordOnStartup": true
Starts recording images as soon as the tracking pipeline is running. Images are NOT recorded while the SDK is paused.

recordToNewDir (bool, optional, default: true)
Example: "recordToNewDir": false
Creates a new directory each time the tracking configuration is initialized. The parent directory of the given URI is used, and an incrementing number is appended until a new folder can be created.

recordURIPrefix (string, optional, default: "local-storage-dir:/VisionLib/records/record/image_")
Example: "recordURIPrefix": "local-storage-dir:vlSDK/myImages/image_"
Specifies the URI prefix the images are written to. local-storage-dir: is a scheme pointing to a user-accessible folder on the local device. On Windows and macOS you will find the data in the user's documents folder; on iOS and Android it is placed in the application's documents folder. Note that the prefix is deliberately an incomplete filename: in the default case the image filenames are image_00000.ext, and the appended 5-digit counter is incremented on every frame.

recordImageType (string, optional, default: "jpg")
Example: "recordImageType": "png"
Specifies the image file format and extension written to the specified URI. Allowed values are png and jpg only. NOTE: Writing JPG can be considerably faster and occupies less disk space; PNG files, on the other hand, do not lose any quality.

extendibleTracking (boolean, optional, default: false)
Example: "extendibleTracking": true
When enabled, the device initializes external SLAM routines if available. On iOS and Android devices, ARKit/ARCore is used in this case, and the external SLAM poses are saved as well. The camera resolution may then be determined by the ARKit/ARCore session.
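To illustrate recordToNewDir and recordURIPrefix, a recording with the default prefix could produce a layout like the following; the exact numbering appended to the record folder is an assumption, only the incrementing behavior itself is documented:

VisionLib/records/record/image_00000.jpg    (first session)
VisionLib/records/record/image_00001.jpg
VisionLib/records/record1/image_00000.jpg   (second session, recordToNewDir: true)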

Example Configuration File

{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "meta": {
        "name": "Simple Recording Example",
        "description": "Simple configuration file allowing for recording image sequences",
        "author": "Visometry GmbH"
    },
    "tracker": {
        "type": "imageRecorder",
        "version": 1,
        "parameters": {
            "recordOnStartup": true,
            "recordToNewDir": true,
            "recordURIPrefix": "local-storage-dir:vlSDK/myProject/image_",
            "recordImageType": "png"
        }
    }
}
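As mentioned above, the model tracker configuration accepts the same recording parameters, which allows recording while tracking. The following is only a rough sketch; the model tracker parameters shown here (modelURI, metric) and their values are placeholders for your own setup:

{
    "type": "VisionLibTrackerConfig",
    "version": 1,
    "tracker": {
        "type": "modelTracker",
        "version": 1,
        "parameters": {
            "modelURI": "project-dir:MyModel.obj",
            "metric": "m",
            "recordOnStartup": true,
            "recordURIPrefix": "local-storage-dir:vlSDK/myModelTrackerProject/image_",
            "recordImageType": "jpg"
        }
    }
}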

Using Scheme Parameters

You can easily customize the recording of the images by passing scheme parameters to your configuration. Every parameter is reflected as a scheme parameter and will overwrite the existing or default values defined in the tracking configuration file.

Example

Writing an image sequence when using a model tracker configuration:

myModelTracker.vl?tracker.parameters.debugLevel=1&tracker.parameters.recordOnStartup=true&tracker.parameters.recordURIPrefix=local-storage-dir:vlSDK/myModelTrackerProject/image_

Writing an image sequence from a mobile phone or application to your server

This is a small tutorial for using the image recorder in conjunction with your WiFi network, sending the images directly to your PC instead of writing them to the device. (This is not enabled for UWP, which includes HoloLens. If you require this feature, please contact us.)

The main idea is to simply replace the recordURIPrefix in the configuration file with your network address, so that each image is posted via HTTP to your application, which then writes it to your hard disk.

Prerequisites

  • Node.js installed on your system.
  • A working build of the image recorder example Unity project.

Server in node.js

To set up the server, enter the following commands in your new project directory:

npm init
npm install express --save
npm install body-parser --save

Create a file named index.js and put the following code inside:

var express = require('express');
var http = require('http');
var fs = require('fs');

var app = express();
var bodyParser = require('body-parser');
// Accept raw binary bodies (the posted images) up to 5000 MB.
var rawParser = bodyParser.raw({ type: 'application/*', limit: '5000mb' });

http.createServer(app).listen(2525, function() {
    console.log('Express Endpoint server listening on port 2525');
});

app.use(rawParser);

// Each image is posted to /upload/<filename>, e.g. /upload/image_00000.jpg.
app.post('/upload/:id', rawParser, function(req, res) {
    var filename = req.params.id;
    console.log('Request file: ' + filename + ' -> ' + req.get('Content-Type') + ' Len: ' + req.get('Content-Length'));
    console.log('Headers: ' + JSON.stringify(req.headers));
    if (!req.body) {
        console.log('No body available for file: ' + filename);
        return res.sendStatus(400);
    }
    // Write the raw request body into the local 'images' directory.
    var newFilename = __dirname + '/images/' + filename;
    console.log('Writing: ' + newFilename + ' with len: ' + req.body.length);
    fs.writeFileSync(newFilename, req.body);
    return res.sendStatus(200);
});

You may need to adapt the code slightly to your needs (e.g. the paths when using Windows).
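For instance, the target path could be built with Node's path module, so that directory separators are also handled correctly on Windows (a small sketch replacing the string concatenation above):

var path = require('path');
var newFilename = path.join(__dirname, 'images', filename);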

Be sure that a directory named images exists next to index.js.

You can now start the server by calling node index.js.
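Before deploying to a device, you can verify the endpoint with a local test upload; test.jpg here stands for any image file on your disk, and the content type only needs to match application/* so the raw parser accepts the body:

curl -X POST -H "Content-Type: application/octet-stream" --data-binary @test.jpg http://localhost:2525/upload/image_00000.jpg

The uploaded file should then appear in the images directory.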

Modifying the tracking configuration

Open the tracking configuration you are using, or, if you are using the image recorder example in Unity, copy and open imageRecorder.vl under StreamingAssets/VisionLib/Examples/ImageRecorder. In your tracking configuration, set recordToNewDir to false and adjust the recordURIPrefix as shown below:

"tracker": {
"type": "imageRecorder",
"version": 1,
"parameters": {
"recordOnStartup": true,
"recordToNewDir": false,
"recordURIPrefix": "http://TheIpOfYourPC:2525/upload/image_",
"recordImageType": "jpg"
}
}
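Alternatively, if you prefer not to edit the configuration file, the same overrides can be passed as scheme parameters as described above (URL-encoding the value may be necessary):

imageRecorder.vl?tracker.parameters.recordToNewDir=false&tracker.parameters.recordURIPrefix=http://TheIpOfYourPC:2525/upload/image_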

When deploying to an iOS or Android device, you will now receive the images on your PC.

This will only work with mobile devices, since Unity does NOT allow sending the images from within the editor.