The VisionLib deeply integrates the ARKit 1.5 features and the experimental features of ARKit 2.0. If you want to take advantage of ARKit on your iOS device, you can do so by simply enabling extendibleTracking in your parameters. If your device ships with an A9 or later processor, the VisionLib will automatically use the device's SLAM prediction. It is essential that the metric of your model matches the real object, since ARKit tracks in meters. If your .vl file is set up to track in mm, that is fine: the internal metric will be rescaled to match ARKit. In any case, the units MUST be set correctly (see also Understanding Tracking Parameters).
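As a minimal sketch, a .vl tracking configuration with ARKit support enabled could look like the following. The surrounding structure (modelTracker, modelURI) is a typical model tracker setup shown only for context, and MyModel.obj is a placeholder; the parameters discussed here are metric and extendibleTracking.

```json
{
  "type": "VisionLibTrackerConfig",
  "version": 1,
  "tracker": {
    "type": "modelTracker",
    "version": 1,
    "parameters": {
      "modelURI": "project_dir:MyModel.obj",
      "metric": "mm",
      "extendibleTracking": true
    }
  }
}
```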
If you use ARKit in Unity, please do not include the ARKit plugin at the same time, as this can lead to significant performance drops.
The following tracking states are available and can be identified when using ARKit with the VisionLib:
Tracking State | Tracking Quality | Description |
---|---|---|
lost | greater than 0 | The tracker is searching for a pose. |
tracked | greater than trackingInlierRatio | The tracker is successfully tracking the object. |
critical | greater than 0 | The tracker is tracking the object using the ARKit pose; the object is in the field of view. |
critical | 0 | The tracker is tracking the object using the ARKit pose; the object is NOT in the field of view. |
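The tracking state can be observed as a JSON message (for example via the tracking state events of the Unity integration). The excerpt below is only a sketch of how the state and quality from the table above might appear; treat the exact field names and layout as an assumption and consult the tracking state documentation for the authoritative format.

```json
{
  "objects": [
    {
      "name": "TrackedObject",
      "state": "tracked",
      "quality": 0.82
    }
  ]
}
```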
If you set disableExternalSLAM to true (false by default), the use of ARKit is disabled and the VisionLib internal SLAM is used instead (which is not recommended in all cases).
If you have a static object, which does not move while tracking, you should enable the staticScene parameter. This stabilizes the pose when the camera is not moved beyond a certain distance while using ARKit. The distance can be configured using the keyFrameDistance parameter in mm, or by overwriting _staticScenePrefExtPoseDistance if you want to control the behavior separately.
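A hedged fragment of the parameters section with these options set; the keyFrameDistance value of 500 (mm) is only an illustrative choice, not a recommendation.

```json
"parameters": {
  "extendibleTracking": true,
  "staticScene": true,
  "keyFrameDistance": 500
}
```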
By default the "low-res" images of ARKit will be used. Usually Apple uses 1280x720 pixels in this case. The aspect anyway does usually not fit the iPad aspect of 4:3 and thus the image will be cropped. In order to gain the full Field of View (FOV) on iPad devices you may set the experimental enableARKitHighRes
parameter to true. This will use high resolutions, if available on the device (e.g. 1440x1920). A better way for configuring the high resolution is to use the input section:
Please refer to Configuration File Reference.
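If you only want to try the experimental flag, a parameters fragment could look like the sketch below. Note that enableARKitHighRes is experimental; the input-section based configuration described in the Configuration File Reference remains the preferred way.

```json
"parameters": {
  "extendibleTracking": true,
  "enableARKitHighRes": true
}
```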
With ARKit 2.0, if available, the VisionLib allows you to save the ARKit WorldMap in addition to your init data (see Initialization: Fast Init & Re-initialization). Enable this feature by adding useExternalSLAMMap: true to your parameters. It is very useful if the target object is not being moved (you may also set staticScene: true). This allows a very reliable re-localization after an ARKit map has been built. The map can be saved, cleared and reloaded using the initData commands.
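A sketch of the parameters section for a static object with WorldMap support enabled, in the same fragment style as above:

```json
"parameters": {
  "extendibleTracking": true,
  "staticScene": true,
  "useExternalSLAMMap": true
}
```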
You will observe two additionally saved init data files, named filename.binz.arkitwd and filename.binz.arkitwd.relPose. The state of the world map acquisition is also reflected in the tracking states as _WorldMappingStatus. It can have the following values, following the Apple Developer docs:
_WorldMappingStatus Value | Apple Name | Description |
---|---|---|
N/A | ARWorldMappingStatusNotAvailable | No world map is available. |
Limited | ARWorldMappingStatusLimited | World tracking has not yet sufficiently mapped the area around the current device. |
LimitedDetected | ARWorldMappingStatusLimited & Tracking Valid | World tracking has not yet sufficiently mapped the area around the current device, but a pose corresponding to the VisionLib anchors has been found. |
Extending | ARWorldMappingStatusExtending | Visited areas have already been mapped, but mapping is still going on. |
Mapped | ARWorldMappingStatusMapped | The visible areas have been adequately mapped. |
If you are working with a saved map, a valid pose can already be recognized as soon as the WorldMappingStatus switches to LimitedDetected.
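Using the same hedged state layout as in the sketch above, this is the combination to look for when re-localizing against a saved map: a valid object state while the map status is still LimitedDetected.

```json
{
  "objects": [
    {
      "name": "TrackedObject",
      "state": "tracked",
      "quality": 0.74,
      "_WorldMappingStatus": "LimitedDetected"
    }
  ]
}
```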
It is useful to enable plane detection when using ARKit, so that plane anchors are created automatically. You can enable it using the externalSLAMPlaneDetect parameter: 0 = no plane detection (default), 1 = detect horizontal planes, 2 = detect vertical planes, 3 = detect both.
Enable the externalSLAMDraw parameter in order to see the recognized feature points and the line boundaries of all the plane anchors.
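A fragment combining both debugging-related options: externalSLAMPlaneDetect set to 3 enables horizontal and vertical plane detection, and externalSLAMDraw visualizes the detected feature points and plane boundaries. Whether externalSLAMDraw expects a boolean or an integer should be verified against the parameter reference; true is an assumption here.

```json
"parameters": {
  "extendibleTracking": true,
  "externalSLAMPlaneDetect": 3,
  "externalSLAMDraw": true
}
```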