
External SLAM (ARKit/ARCore) Support

Level: Intermediate

VisionLib deeply integrates support for external SLAM on mobile devices (ARKit 1.5 features and experimental ARKit 2.0 features on iOS, ARCore features on Android). To take advantage of external SLAM, simply enable extendibleTracking in your parameters. If your device supports ARKit (i.e. it ships with an A9 or newer processor) or ARCore (a list of supported devices is available from Google), VisionLib will automatically use the SLAM prediction of the device.
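
As a minimal sketch, assuming the usual layout of a model tracker configuration, the parameter could be enabled like this inside the tracker block (the model URI is just a placeholder):

"tracker": {
    "type": "modelTracker",
    "parameters": {
        "modelURI": "MyModel.obj",
        "extendibleTracking": true
    }
}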

It is absolutely necessary that the metric of your model fits the real object, since external SLAM will locate the camera in real-world units. Therefore, the units MUST be set correctly (see also Mandatory Initial Parameters).
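
For example, if your model data is authored in millimeters, the unit would be declared accordingly; a sketch, assuming the unit is specified via the metric parameter described under Mandatory Initial Parameters and placed in the parameters block shown above:

"parameters": {
    "metric": "mm",
    "extendibleTracking": true
}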

Note: If you use VisionLib with external SLAM in Unity, please do not additionally use features from the ARKit or ARCore packages, as this can lead to significant performance drops.

Additional parameters

Static Scenery

If you have a static object that does not move while tracking, you should enable the staticScene parameter. More details regarding this parameter can be found on the Optional Tracking Parameters page.
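
A sketch of the combined parameters for a non-moving object (placement in the parameters block assumed as above):

"parameters": {
    "extendibleTracking": true,
    "staticScene": true
}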

Using different resolutions

Please refer to the Configuration File Reference for setting the resolution. You can add the following section to your tracking configuration in order to do so:

"input":{
"imageSources": [{
"name": "camera0",
"type": "camera",
"data": {
"resolution": "auto"
}
}],
"useImageSource":"camera0"
},

If you experience problems due to high resolutions, set the resolution directly to "widthxheight" (e.g. "1280x720"). That resolution will be chosen if it is available; otherwise the first resolution offered by the device will be used.
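
For instance, to pin the resolution to 1280x720, only the data block of the image source shown above changes:

"data": {
    "resolution": "1280x720"
}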

Experimental Features (ARKit only)

Saving the world point cloud in ARKit 2.0 (only available in iOS 12+)

With ARKit 2.0, if available, VisionLib allows you to save the ARKit WorldMap in addition to your init data (see Initialization: Fast Init & Re-initialization). Enable this feature by adding useExternalSLAMMap:true to your parameters. It is very useful if the target object is not being moved (in that case you may also set staticScene:true), as it allows very reliable relocalization after an ARKit map has been built. The map can be saved, cleared and reloaded using the initData commands.
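
A sketch of the relevant parameters for a static target with world map saving enabled (placement in the parameters block assumed as above):

"parameters": {
    "extendibleTracking": true,
    "staticScene": true,
    "useExternalSLAMMap": true
}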

You will then see two additionally saved init data files, named filename_worldMap.arkitwd and filename_worldMapFromModelTransform.xml. The state of the world map acquisition is also reflected in the tracking states as _WorldMappingStatus. Following the Apple Developer documentation, it can have the following values:

Tracking State _WorldMappingStatus | Apple Name | Description
N/A | ARWorldMappingStatusNotAvailable | No world map is available.
Limited | ARWorldMappingStatusLimited | The world has not yet been mapped sufficiently around the current device position.
LimitedDetected | ARWorldMappingStatusLimited & Tracking Valid | The world has not yet been mapped sufficiently around the current device position, but a pose corresponding to the VisionLib anchors has been found.
Extending | ARWorldMappingStatusExtending | Visited areas have already been mapped, but mapping is still going on.
Mapped | ARWorldMappingStatusMapped | The world has been adequately mapped in the visible areas.

If you are working with a saved map, a valid pose can already be recognized as soon as the status switches to LimitedDetected.

Debugging features

Plane Detection

When using ARKit, it is useful to enable plane detection in order to create automatic plane anchors. You can control this with the externalSLAMPlaneDetect parameter: 0 = no plane detection (default), 1 = detect horizontal planes, 2 = detect vertical planes, 3 = detect both.
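
A sketch enabling detection of both horizontal and vertical planes (placement in the parameters block assumed as above):

"parameters": {
    "extendibleTracking": true,
    "externalSLAMPlaneDetect": 3
}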

Showing detected points

Enable the externalSLAMDraw parameter in order to see the recognized feature points and line boundaries of all the plane anchors.
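
A sketch with drawing enabled; treating externalSLAMDraw as a boolean flag is an assumption based on the wording above, and the placement in the parameters block is assumed as before:

"parameters": {
    "extendibleTracking": true,
    "externalSLAMPlaneDetect": 3,
    "externalSLAMDraw": true
}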