This page concerns the Vuforia Engine API version 9.8 and earlier. It has been deprecated and will no longer be actively updated.
This document demonstrates general usage of the Model Targets API, with code snippets for commonly performed tasks. For a more general introduction to working with Model Targets, see the Model Targets Overview and the Model Targets API Overview.
A Model Target database consists of a single Model Target with one or more Guide Views. Alternatively, the database can be trained as an Advanced Model Target database containing one or more Model Targets, each with one or more Advanced Guide Views that allow a detection range of up to 360°.
Model Targets are tracked using the ObjectTracker and therefore are used in a very similar way to Image Targets, VuMarks, Cylinder Targets, and Multi Targets. Unique to Model Targets is the GuideView, which defines an initialization pose and therefore requires an additional rendered overlay that guides the user to this pose.
Adding a License Key
If you want to use your own Model Targets created using the Model Target Generator, you will need to have an appropriate license key. See How to Add a License Key to Your Vuforia App for more information.
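For the native API, the license key is passed to Vuforia Engine before initialization. The snippet below is a minimal sketch, assuming the iOS/UWP-style overload Vuforia::setInitParameters(int flags, const char* licenseKey) (on Android, a separate overload additionally takes the Activity object); "YOUR_LICENSE_KEY" and the helper name are placeholders.
// Minimal sketch: set the license key before initializing Vuforia Engine.
// Assumes the iOS/UWP-style overload of Vuforia::setInitParameters; on Android
// the overload additionally takes the Activity object.
#include <Vuforia/Vuforia.h>

bool initVuforiaWithLicense(int initFlags)
{
    // initFlags selects the rendering API and is platform specific.
    Vuforia::setInitParameters(initFlags, "YOUR_LICENSE_KEY");

    // Vuforia::init() reports progress in percent and must be called until it
    // returns 100 (a negative value indicates an initialization error).
    int progress = 0;
    do
    {
        progress = Vuforia::init();
    } while (progress >= 0 && progress < 100);
    return progress == 100;
}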
Using Model Targets - General
Init and Destroy the ObjectTracker
To start working with Model Targets, initialize the ObjectTracker:
TrackerManager& trackerManager = TrackerManager::getInstance();
ObjectTracker* objectTracker = static_cast<ObjectTracker*>(
    trackerManager.initTracker(Vuforia::ObjectTracker::getClassType()));
To destroy the ObjectTracker:
TrackerManager& trackerManager = TrackerManager::getInstance();
trackerManager.deinitTracker(Vuforia::ObjectTracker::getClassType());
Start and Stop the Object Tracker
Start the Object Tracker:
objectTracker->start();
Stop the Object Tracker:
objectTracker->stop();
Loading a Model Target database
Create a new DataSet to hold the Model Target via the ObjectTracker instance, and load the data files:
const char* modelTargetDatasetPath = "VuforiaMars_ModelTarget360.xml";
DataSet* modelTargetDataset = objectTracker->createDataSet();
if (DataSet::exists(modelTargetDatasetPath, Vuforia::STORAGE_APPRESOURCE))
{
    modelTargetDataset->load(modelTargetDatasetPath, Vuforia::STORAGE_APPRESOURCE);
}
Activate the Dataset to enable detection and tracking:
objectTracker->activateDataSet(modelTargetDataset);
Deactivate the Dataset after using the tracker (after stopping the tracker):
objectTracker->deactivateDataSet(modelTargetDataset);
Finally, the Dataset must be destroyed after usage (after stopping the tracker and deactivating the Dataset):
objectTracker->destroyDataSet(modelTargetDataset);
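Putting these steps together, a typical lifecycle looks like the following sketch. It uses only the Vuforia calls shown above; the helper function names and the minimal error handling are illustrative, not part of the Vuforia API.
// Illustrative sketch combining the dataset lifecycle steps shown above.
bool startModelTargetTracking(Vuforia::ObjectTracker* objectTracker, Vuforia::DataSet*& modelTargetDataset)
{
    const char* modelTargetDatasetPath = "VuforiaMars_ModelTarget360.xml";

    modelTargetDataset = objectTracker->createDataSet();
    if (modelTargetDataset == nullptr)
        return false;

    if (!Vuforia::DataSet::exists(modelTargetDatasetPath, Vuforia::STORAGE_APPRESOURCE) ||
        !modelTargetDataset->load(modelTargetDatasetPath, Vuforia::STORAGE_APPRESOURCE))
        return false;

    // Activate the dataset, then start the tracker.
    return objectTracker->activateDataSet(modelTargetDataset) && objectTracker->start();
}

void stopModelTargetTracking(Vuforia::ObjectTracker* objectTracker, Vuforia::DataSet* modelTargetDataset)
{
    // Teardown order: stop the tracker, deactivate the dataset, then destroy it.
    objectTracker->stop();
    objectTracker->deactivateDataSet(modelTargetDataset);
    objectTracker->destroyDataSet(modelTargetDataset);
}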
Using Model Target Results
Your general app workflow
The diagram above illustrates the recommended workflow for initializing, checking, and displaying your Model Target application. This approach executes the right behavior independently of the type of Model Target. Use the Status of the TrackableResult to check on the state of the Model Target database and decide whether to render a Guide View or any augmentations. For more general information on tracking states, please see Interpreting Tracking API Results.
Model Target Database Behavior
The workflow is universal for all types of Model Target databases (Standard Model Targets and Advanced Model Targets) as the app only needs to observe the State for each camera frame to determine what it should do without having to consider the type of Model Target database. In more detail:
- Standard Model Target databases contain one Model Target with one or more Guide Views. In this case, a TrackableResult is immediately pushed to the State after activation with getStatus() == NO_POSE and getStatusInfo() == NO_DETECTION_RECOMMENDING_GUIDANCE. At this point, display the corresponding Guide View to assist users in moving to the right position. Once alignment is achieved, the status changes to TRACKED and augmentations can be rendered.
- Advanced Model Target databases contain one or more Model Targets, each with a recognition range of up to 360°. These are initialized on the Vuforia State with getStatus() == NO_POSE and getStatusInfo() == INITIALIZING when the database is activated. Once a specific target is recognized, it is immediately activated and the corresponding TrackableResult changes to getStatus() == TRACKED and getStatusInfo() == NORMAL. Before a specific view or object is recognized, we recommend a visual indication on screen informing users to look around until the Model Target object is detected, as shown in the diagram above. See Model Target Guide View for more information on Viewfinder UX and Symbolic Guide Views.
Checking for results and rendering Guide Views and augmentations
The following code snippet gives an example of how to inspect the State to find out whether a Guide View or an augmentation needs to be displayed. This code represents the recommended unified workflow that works independently of the type of Model Target database.
void render()
{
const Vuforia::State state = Vuforia::TrackerManager::getInstance().getStateUpdater().updateState();
bool hadModelTargetResult = false;
const Vuforia::ModelTarget *modelRequiringGuidance = nullptr;
int targetsInitializing = 0;
for (const Vuforia::TrackableResult* trackableResult : state.getTrackableResults())
{
// Check for a Model Target tracked pose.
if (trackableResult->isOfType(ModelTargetResult::getClassType()))
{
const Vuforia::ModelTargetResult* mtResult = static_cast<const Vuforia::ModelTargetResult*>(trackableResult);
const Vuforia::ModelTarget &modelTarget = mtResult->getTrackable();
if (trackableResult->getStatus() == Vuforia::TrackableResult::STATUS::TRACKED || trackableResult->getStatus() == Vuforia::TrackableResult::STATUS::EXTENDED_TRACKED)
{
hadModelTargetResult = true;
// Render Augmentation
// Get a model-view matrix representing the pose of the Object Target.
// Augmentations should be rendered using this model-view matrix as a baseline.
Vuforia::Matrix44F modelViewMatrix = Vuforia::Tool::convertPose2GLMatrix(mtResult->getPose());
...
}
else if (trackableResult->getStatusInfo() == Vuforia::TrackableResult::STATUS_INFO::NO_DETECTION_RECOMMENDING_GUIDANCE)
{
// store target requiring guide view
modelRequiringGuidance = &modelTarget;
}
else if (trackableResult->getStatusInfo() == Vuforia::TrackableResult::STATUS_INFO::INITIALIZING)
{
// Remember number of targets waiting to be recognized
++targetsInitializing;
}
}
}
if (!hadModelTargetResult)
{
if (modelRequiringGuidance)
{
// render guide view
int activeIndex = modelRequiringGuidance->getActiveGuideViewIndex();
const Vuforia::GuideView *activeGuideView = modelRequiringGuidance->getGuideViews().at(activeIndex);
...
}
else if (targetsInitializing == 1)
{
// render symbolic guide view for specific target (optional)
}
else if (targetsInitializing > 1)
{
// render generic viewfinder UX
}
}
}
Render the Guide View
In most use cases, you will render the Guide View using the default image from the GuideView instance, drawn with a textured rectangle overlay, as part of your render loop. Note that to guarantee correct alignment, it is important to either render the Guide View such that its longer side matches the longer side of the camera image, or to use the Guide View and camera intrinsics to scale the image appropriately. The following code demonstrates the first variant:
// Initialization code
if (guideView != nullptr && guideView->getImage() != nullptr)
{
textureId = MyApp::createTexture(const_cast<Vuforia::Image*>(guideView->getImage()));
}
MyApp::renderVideoBackground();
// Only display the guide when you don’t have tracking
// scale your guide view with the video background rendering
if (guideView != nullptr && guideView->getImage() != nullptr && state.getTrackableResults().size() == 0)
{
float guideViewAspectRatio = (float)guideView->getImage()->getWidth() / guideView->getImage()->getHeight();
float cameraAspectRatio = (float)viewport.data[2] / viewport.data[3];
float planeDistance = 0.01f;
float fieldOfView = Vuforia::CameraDevice::getInstance().getCameraCalibration().getFieldOfViewRads().data[1];
float nearPlaneHeight = 2.0f * planeDistance * tanf(fieldOfView * 0.5f);
float nearPlaneWidth = nearPlaneHeight * cameraAspectRatio;
float planeWidth;
float planeHeight;
if (guideViewAspectRatio >= 1.0f && cameraAspectRatio >= 1.0f) // guideview landscape, camera landscape
{
// scale so that the long side of the camera (width)
// is the same length as guideview width
planeWidth = nearPlaneWidth;
planeHeight = planeWidth / guideViewAspectRatio;
}
else if (guideViewAspectRatio < 1.0f && cameraAspectRatio < 1.0f) // guideview portrait, camera portrait
{
// scale so that the long side of the camera (height)
// is the same length as guideview height
planeHeight = nearPlaneHeight;
planeWidth = planeHeight * guideViewAspectRatio;
}
else if (cameraAspectRatio < 1.0f) // guideview landscape, camera portrait
{
// scale so that the long side of the camera (height)
// is the same length as guideview width
planeWidth = nearPlaneHeight;
planeHeight = planeWidth / guideViewAspectRatio;
}
else // guideview portrait, camera landscape
{
// scale so that the long side of the camera (width)
// is the same length as guideview height
planeHeight = nearPlaneWidth;
planeWidth = planeHeight * guideViewAspectRatio;
}
// normalize world space plane sizes into view space again
Vuforia::Vec2F scale = Vuforia::Vec2F(planeWidth / nearPlaneWidth, -planeHeight / nearPlaneHeight);
// render the guide view image using an orthographic projection
MyApp::renderRectangleTextured(scale, color, textureId);
}
Native API references
The ModelTarget class extends the base ObjectTarget with methods related to querying the bounding box and associated guide views.
The GuideView class provides access to the 2D guide view images and the associated pose and camera intrinsics. It also allows overriding the detection pose for scenarios where the pose is to be changed dynamically at runtime.
The TrackableResult class provides state information on the Trackable and its Type.
The ObjectTracker class tracks real-world objects of all target types. Create, activate, and deactivate DataSet instances with this class.
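As an illustrative sketch of the ModelTarget and GuideView classes above (not taken from the earlier snippets), the following enumerates the Model Targets in a loaded dataset and reads each Guide View's image and detection pose. It assumes GuideView::getPose() and GuideView::setPose() for the runtime pose override mentioned above, and that GuideViewList exposes size() alongside the at() accessor used earlier; the helper name is hypothetical.
// Hypothetical helper illustrating ModelTarget and GuideView access.
// Assumes a dataset that has already been created and loaded.
void inspectModelTargets(Vuforia::DataSet* dataset)
{
    for (int i = 0; i < dataset->getNumTrackables(); ++i)
    {
        Vuforia::Trackable* trackable = dataset->getTrackable(i);
        if (!trackable->isOfType(Vuforia::ModelTarget::getClassType()))
            continue;

        Vuforia::ModelTarget* modelTarget = static_cast<Vuforia::ModelTarget*>(trackable);

        // Enumerate the Guide Views associated with this Model Target.
        for (size_t g = 0; g < modelTarget->getGuideViews().size(); ++g)
        {
            Vuforia::GuideView* guideView = modelTarget->getGuideViews().at(g);
            const Vuforia::Image* image = guideView->getImage();     // 2D Guide View image
            Vuforia::Matrix34F detectionPose = guideView->getPose(); // initialization pose
            // The detection pose can be overridden dynamically at runtime, e.g.:
            // guideView->setPose(myCustomPose);
            (void)image;
            (void)detectionPose;
        }
    }
}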