This page concerns the Vuforia Engine API version 9.8 and earlier. It has been deprecated and will no longer be actively updated.
Overview
The RenderingPrimitives and Renderer classes are the two main components providing AR rendering support in the Vuforia API.
The Renderer class supports updating the video background texture and uses graphics resources (e.g. textures and rendering calls).
The RenderingPrimitives class, on the other hand, is renderer-agnostic (CPU data only) and should be used to drive all the logic you need for video-based monocular AR rendering.
The RenderingPrimitives are built on the concept of rendering building blocks (the "primitives" in RenderingPrimitives) that you can easily combine to render your scene. These primitives follow common rendering concepts:
- Viewport for your augmentation,
- Projection Matrix for your augmentation,
- Primitives object for video background rendering,
- Projection Matrix for video background,
- Etc.
The RenderingPrimitives are built using information from the tracking state as well as device information (screen orientation, type of device). You can think of them as the rendering equivalent of the tracking state: a rendering state.
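As a quick illustration, a RenderingPrimitives instance can be retrieved from the Device singleton; the following is a minimal Java sketch (the native equivalent is Vuforia::Device::getInstance().getRenderingPrimitives()):
import com.vuforia.Device;
import com.vuforia.RenderingPrimitives;

// Query a fresh copy of the rendering primitives from the Device singleton.
// This should be re-done when the configuration changes (e.g. an orientation change),
// not on every frame.
RenderingPrimitives renderingPrimitives = Device.getInstance().getRenderingPrimitives();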
Sections
- Rendering Primitives Usage for Mobile AR
- Using the Render Primitive Sample Class
- Migrate Native Rendering to Use Rendering Primitives
Rendering Primitives Usage for Mobile AR
Mobile AR rendering is technically equivalent to monocular video see-through AR rendering. This involves two major steps: rendering the video in the background and rendering the augmentation as an overlay.
In your rendering application, the following workflow can be used to achieve this:
- Setup your Viewport (via RenderingPrimitives)
- Video Background Rendering: orthographic rendering of a quad mesh with a video texture
- Update the video background texture from the Tracking State (via Renderer)
- Retrieve video background rendering primitives such as video background mesh, video background projection matrix (via RenderingPrimitives)
- Render your video background mesh
- Augmentation Rendering:
- Retrieve projection matrix (via RenderingPrimitives)
- Retrieve device pose matrix, invert it to get the View matrix
- For each detected trackable, compute the Model matrix
- Combine the Projection matrix, View matrix, and Model matrix to render the trackable
For all these steps, you can use the Vuforia::Tool class and the MathUtils class (from the sample apps) to support the additional mathematical transformations.
Example code for the workflow above, starting with the viewport setup:
// 1. Setup the viewport
// We're writing directly to the screen, so the viewport is relative to the screen
Vuforia::Vec4I viewport = renderingPrimitives->getViewport(Vuforia::VIEW_SINGULAR);
// [Graphics API specific code to setup the viewport using OpenGL, Metal or DirectX]

// 2. Video Background rendering
// 2.a Update video background texture
// [..]

// 2.b Retrieve video background rendering primitives
if (drawVideo)
{
    Vuforia::Matrix44F vbProjectionMatrix = Vuforia::Tool::convert2GLMatrix(
        renderingPrimitives->getVideoBackgroundProjectionMatrix(Vuforia::VIEW_SINGULAR));
    const Vuforia::Mesh& vbMesh =
        renderingPrimitives->getVideoBackgroundMesh(Vuforia::VIEW_SINGULAR);

    // 2.c Rendering Video background mesh
    // [Graphics API specific code to render the video background mesh using OpenGL, Metal or DirectX]
}

// 3. Augmentation Rendering
// 3.1 Retrieve projection matrix
// Retrieve the projection matrix to use for the augmentation
Vuforia::Matrix44F projectionMatrix = Vuforia::Tool::convertPerspectiveProjection2GLMatrix(
    renderingPrimitives->getProjectionMatrix(Vuforia::VIEW_SINGULAR, state.getCameraCalibration()),
    NEAR_PLANE, FAR_PLANE);

// 3.2 Compute View Matrix
Vuforia::Matrix44F viewMatrix = MathUtils::Matrix44FIdentity();
// Read the device pose from the state and create the view matrix in the world coordinate system
if (state.getDeviceTrackableResult() != nullptr && state.getDeviceTrackableResult()->getStatus() != Vuforia::TrackableResult::NO_POSE)
{
    Vuforia::Matrix44F deviceMatrix = Vuforia::Tool::convertPose2GLMatrix(state.getDeviceTrackableResult()->getPose());
    // [invert deviceMatrix to get the view matrix]
    // viewMatrix =
}

// 3.3 Compute Model Matrix for each trackable
const auto& trackableResultList = state.getTrackableResults();
for (const auto& trackableResult : trackableResultList)
{
    if (trackableResult->isOfType(Vuforia::ImageTargetResult::getClassType()))
    {
        const Vuforia::ImageTargetResult* itResult = static_cast<const Vuforia::ImageTargetResult*>(trackableResult);
        const Vuforia::ImageTarget& it = itResult->getTrackable();

        Vuforia::Matrix44F modelMatrix = Vuforia::Tool::convertPose2GLMatrix(trackableResult->getPose());
        Vuforia::Matrix44F modelViewMatrix;
        MathUtils::multiplyMatrix(viewMatrix, modelMatrix, modelViewMatrix);

        // 3.4 Render your model using the combination of the projection matrix and modelViewMatrix
        // [Graphics API specific code to render the augmentation using OpenGL, Metal or DirectX]
    }
}
Using the Render Primitive Sample Class
This class encapsulates the RenderingPrimitives usage, allowing a smooth migration from older native apps to the newer implementation. Changing from the deprecated APIs to the new APIs requires just a few steps.
SampleAppRendererControl. This interface is implemented by the sample Renderer class. The SampleAppRenderer class calls its renderFrame method to render the scene. The method requires two parameters: the state, used to get the trackable results, and the projection matrix for the current view.
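A minimal sketch of what this interface can look like (method names are taken from the calls shown below; the exact file in your project may differ):
import com.vuforia.State;

public interface SampleAppRendererControl
{
    // Called once so the app-specific renderer can set up its own rendering resources
    void initRendering();

    // Called by SampleAppRenderer for the current view; the state and the projection
    // matrix are provided by SampleAppRenderer and should not be cached elsewhere
    void renderFrame(State state, float[] projectionMatrix);
}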
SampleAppRenderer. The following stores a reference to the rendering primitives so as to avoid updating it every cycle. The value is updated only when a configuration change occurs, such as a rotation or the app going to the background:
private RenderingPrimitives mRenderingPrimitives = null;
This is used to call renderFrame(State, projectionMatrix) from the Renderer:
private SampleAppRendererControl mRenderingInterface = null;
These are the near and far planes. The plane values are initialized in the constructor but can be changed using setNearFarPlanes(near, far):
private float mNearPlane = -1.0f;
private float mFarPlane = -1.0f;
Texture used for the video background rendering:
private GLTextureUnit videoBackgroundTex = null;
The constructor sets the near and far planes and keeps a reference to the activity and to the interface used to call renderFrame when required:
public SampleAppRenderer(SampleAppRendererControl renderingInterface,
Activity activity, float nearPlane, float farPlane)
{
...
Device device = Device.getInstance();
On surface created, initialize the shaders for the video background rendering:
public void onSurfaceCreated()
{
initRendering();
}
Called when a screen size change occurs to update the configuration, orientation and dimensions. The rendering primitives are updated to get the correct values for the new configuration:
public void onConfigurationChanged(boolean isARActive)
{
    updateActivityOrientation();
    storeScreenDimensions();
    configureVideoBackground();

    if (!mIsRenderingInit)
    {
        mRenderingInterface.initRendering();
        mIsRenderingInit = true;
    }
}
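The refresh of the cached mRenderingPrimitives field is not shown above; a minimal sketch, assuming a hypothetical helper called from onConfigurationChanged, could look like this:
// Hypothetical helper: re-query the primitives so they match the new configuration
private void updateRenderingPrimitives()
{
    mRenderingPrimitives = Device.getInstance().getRenderingPrimitives();
}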
Gets the state to pass it to the renderFrame method:
State state;
// Get our current state
state = TrackerManager.getInstance().getStateUpdater().updateState();
mRenderer.begin(state);
Gets the viewport values and the projection matrix. Calls renderFrame, passing the current state and the resulting projection matrix. The video background is rendered if the current view is not the postprocess one:
Vec4I viewport = mRenderingPrimitives.getViewport(VIEW.VIEW_SINGULAR);
GLES20.glViewport(viewport.getData()[0],
    viewport.getData()[1],
    viewport.getData()[2],
    viewport.getData()[3]);

float[] projectionMatrix = Tool.convertPerspectiveProjection2GLMatrix(
    mRenderingPrimitives.getProjectionMatrix(VIEW.VIEW_SINGULAR, state.getCameraCalibration()),
    mNearPlane, mFarPlane).getData();

mRenderingInterface.renderFrame(state, projectionMatrix);
Binds the video background texture provided by the SDK in order to render it:
int vbVideoTextureUnit = 0;
videoBackgroundTex.setTextureUnit(vbVideoTextureUnit);
if (!mRenderer.updateVideoBackgroundTexture(videoBackgroundTex))
{
return;
}
Gets the projection matrix specifically to render the video background:
float[] vbProjectionMatrix = Tool.convert2GLMatrix(
mRenderingPrimitives.getVideoBackgroundProjectionMatrix(
VIEW.VIEW_SINGULAR)).getData();
Gets the mesh on which the previously bound texture is used to render the video background:
Mesh vbMesh = mRenderingPrimitives.getVideoBackgroundMesh(VIEW.VIEW_SINGULAR);
After setting up the shader, render the video background, applying the projection matrix returned for that specific view so that it is rendered correctly with the right aspect ratio:
// Then, we issue the render call
GLES20.glDrawElements(GLES20.GL_TRIANGLES,
vbMesh.getNumTriangles() * 3,
GLES20.GL_UNSIGNED_SHORT,
vbMesh.getTriangles().asShortBuffer());
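The shader and vertex attribute setup preceding this draw call is not shown; a minimal sketch, assuming attribute and uniform handles (vbVertexHandle, vbTexCoordHandle, vbProjectionMatrixHandle, vbTexSamplerHandle) obtained from your own video background shader, could look like this:
// Bind the video background mesh data to the (assumed) shader attribute handles
GLES20.glVertexAttribPointer(vbVertexHandle, 3, GLES20.GL_FLOAT,
    false, 0, vbMesh.getPositions().asFloatBuffer());
GLES20.glVertexAttribPointer(vbTexCoordHandle, 2, GLES20.GL_FLOAT,
    false, 0, vbMesh.getUVs().asFloatBuffer());

GLES20.glEnableVertexAttribArray(vbVertexHandle);
GLES20.glEnableVertexAttribArray(vbTexCoordHandle);

// Pass the video background projection matrix and the texture unit to the shader
GLES20.glUniformMatrix4fv(vbProjectionMatrixHandle, 1, false, vbProjectionMatrix, 0);
GLES20.glUniform1i(vbTexSamplerHandle, vbVideoTextureUnit);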
Migrate Native Rendering to Use Rendering Primitives
The following steps document the code changes necessary to migrate a project to use the Rendering Primitives API via the SampleAppRenderer class.
- Copy SampleAppRenderer and SampleAppRendererControl to the SampleApplication folder.
- Add the following imports to the Renderer class:
import com.vuforia.Device;
import com.vuforia.samples.SampleApplication.SampleAppRenderer;
import com.vuforia.samples.SampleApplication.SampleAppRendererControl;
- Implement SampleAppRendererControl in the Renderer class:
public class MultiTargetRenderer implements GLSurfaceView.Renderer, SampleAppRendererControl
- Add a SampleAppRenderer instance:
SampleAppRenderer mSampleAppRenderer;
- At the end of onSurfaceCreated, initialize the instance:
// SampleAppRenderer used to encapsulate the use of RenderingPrimitives setting
// the device mode AR/VR and stereo mode
mSampleAppRenderer = new SampleAppRenderer(this, vuforiaAppSession.getVideoMode(), .01f, 100f);
- At the end of onSurfaceChanged, call onConfigurationChanged from the SampleAppRenderer:
// RenderingPrimitives to be updated when some rendering change is done
mSampleAppRenderer.onConfigurationChanged();
- Replace the renderFrame call with mSampleAppRenderer.render() in the onDrawFrame method:
// Call our function to render content:
renderFrame();
// Call our function to render content from SampleAppRenderer class
mSampleAppRenderer.render();
- Change the renderFrame method so that it uses a State and a projectionMatrix (a rough sketch of the resulting method is shown after this list):
private void renderFrame()
// The render function called from SampleAppRenderer using the RenderingPrimitives views.
// The state is owned by SampleAppRenderer, which controls its lifecycle.
// State should not be cached outside this method.
public void renderFrame(State state, float[] projectionMatrix)
- Change Renderer.getInstance().drawVideoBackground() to mSampleAppRenderer.renderVideoBackground() and remove State state = Renderer.getInstance().begin():
// Clear color and depth buffer
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// Get the state from Vuforia and mark the beginning of a rendering section
State state = Renderer.getInstance().begin();
// Explicitly render the Video Background
Renderer.getInstance().drawVideoBackground();
// Renders video background replacing Renderer.DrawVideoBackground()
mSampleAppRenderer.renderVideoBackground();
- Remove viewport usage:
// Set the viewport
int[] viewport = vuforiaAppSession.getViewport();
GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);
- Use the provided projection matrix instead of getting it from vuforiaAppSession:
Matrix.multiplyMM(modelViewProjection,
0, vuforiaAppSession.getProjectionMatrix().getData(),
0, modelViewMatrix, 0);
Matrix.multiplyMM(modelViewProjection, 0, projectionMatrix, 0, modelViewMatrix, 0);
- (The remaining steps apply to SampleApplicationSession.) Remove mScreenWidth, mScreenHeight, mProjectionMatrix, mViewport and mIsPortrait and all their usages.
- Remove updateActivityOrientation(), setProjectionMatrix(), getProjectionMatrix(), getViewport(), configureVideoBackground() and storeScreenDimensions().
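Putting the previous changes together, the migrated renderFrame could look roughly like the following sketch (the per-trackable pose handling and the actual augmentation drawing are app-specific and left as placeholders):
public void renderFrame(State state, float[] projectionMatrix)
{
    // Clear color and depth buffers
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

    // Render the video background (replaces Renderer.getInstance().drawVideoBackground())
    mSampleAppRenderer.renderVideoBackground();

    // [For each trackable result in the state, convert its pose to a model-view matrix,
    //  e.g. via Tool.convertPose2GLMatrix(result.getPose()).getData()]
    float[] modelViewMatrix = new float[16];
    Matrix.setIdentityM(modelViewMatrix, 0);

    // Combine the provided projection matrix with the model-view matrix
    float[] modelViewProjection = new float[16];
    Matrix.multiplyMM(modelViewProjection, 0, projectionMatrix, 0, modelViewMatrix, 0);

    // [Application specific code to render the augmentation with modelViewProjection]
}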