# Tracked Billboard

In this section, you'll discover how to use the Tracked Billboard pipeline with Reality 5.4. For our demonstration, we will employ [TRAXIS Talent Tracking](https://docs.traxis.ai/docs/v/talent-tracking-2.0/) as a provider.

### What Is Tracked Billboard?

Tracked Billboard enables you to seamlessly integrate your keyed talent into the Reality 5.4 scene, achieving a photorealistic appearance. Furthermore, it can be utilized in a variety of creative scenarios, including Teleportation and Fly Cam.

### Process

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/7wSHBY58pGVarHbVAqTc/image.png" alt=""><figcaption><p>Reality Hub Login Screen</p></figcaption></figure></div>

* Log in to Reality Hub.
* Activate the [Launcher](https://zerodensitydocumentation.gitbook.io/docs/reality-5.4/reality-5.4/user-guide/launcher) module and launch your project.

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/fLwailpwm7AIBfJCeb8G/image.png" alt=""><figcaption><p>Dragging &#x26; Dropping Engine into Nodegraph Canvas</p></figcaption></figure></div>

* Go to the **Engines** section, then drag and drop the <mark style="color:yellow;">**`UE5`**</mark> process into the Nodegraph canvas.

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/MEgTKWzgI2awt9ovVu9F/image.png" alt=""><figcaption><p>Node Details Panel</p></figcaption></figure></div>

* Go to the [Node Details Panel](https://zerodensitydocumentation.gitbook.io/docs/reality-5.4/reality-5.4/user-guide/nodegraph-actions/nodegraph/node-details-panel).
* Click on the **Spawn Reality Camera** and **Spawn Reality Tracked Billboard** [function](https://zerodensitydocumentation.gitbook.io/docs/reality-5.4/reality-5.4/user-guide/nodegraph-actions/nodegraph/node-details-panel/function) buttons.

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/FQwLP8B3KXIuM90l4XGT/image.png" alt=""><figcaption></figcaption></figure></div>

As soon as you click the abovementioned function buttons, the following changes occur, as illustrated above:

1. A Reality Actors folder is created.
2. New properties are added.

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/J7MtIIxhG5p0xqYiyS8D/image.png" alt=""><figcaption><p><mark style="color:yellow;"><strong><code>AJAIn</code></strong></mark> and <mark style="color:yellow;"><strong><code>AJAOut</code></strong></mark> Nodes</p></figcaption></figure></div>

* Create <mark style="color:yellow;">**`AJAIn`**</mark> and <mark style="color:yellow;">**`AJAOut`**</mark> nodes, then designate your reference by right-clicking on each node. In our example, we have selected <mark style="color:green;">**`SingleLink 1`**</mark> for <mark style="color:yellow;">**`AJAIn`**</mark> and <mark style="color:green;">**`SingleLink 4`**</mark> for <mark style="color:yellow;">**`AJAOut`**</mark>.
* Establish your corresponding camera tracking node. In our example, we utilized <mark style="color:yellow;">**`Xync`**</mark>.
* Connect the <mark style="color:green;">**`SingleLink 1`**</mark> output pin of the <mark style="color:yellow;">**`AJAIn`**</mark> node to the <mark style="color:green;">**`Sync`**</mark> input pin of the <mark style="color:yellow;">**`Xync`**</mark> node.

### Breaking Track&#x20;

Breaking Track is a procedure for extracting specific outputs from a <mark style="color:yellow;">**`Track`**</mark> node. For instance, you might prefer to use only the <mark style="color:red;">**`Location`**</mark> data from your talent tracking software/hardware, or only a focus property of the <mark style="color:green;">**`Lens Distortion`**</mark> data, in your pipeline. This approach also allows you to conduct advanced compositing operations.

Now:&#x20;

* Create a <mark style="color:yellow;">**`Break Track`**</mark> node.
* Connect the <mark style="color:green;">**`Track`**</mark> output of the <mark style="color:yellow;">**`Xync`**</mark> node to the <mark style="color:green;">**`Input`**</mark> input pin of the <mark style="color:yellow;">**`Break Track`**</mark> node.
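Conceptually, a Break Track node splits one composite tracking sample into its individual fields so that each can be routed downstream on its own. The sketch below illustrates the idea in plain Python; the `TrackSample` structure and its field names are hypothetical illustrations, not Reality's internal data layout:

```python
from dataclasses import dataclass

@dataclass
class TrackSample:
    # Hypothetical composite tracking sample, for illustration only.
    location: tuple        # (x, y, z) position
    rotation: tuple        # (pan, tilt, roll) angles
    lens_distortion: dict  # e.g. radial distortion coefficients

def break_track(sample: TrackSample) -> dict:
    """Expose each field of the composite sample as a separate output,
    mirroring what a Break Track node does on the canvas."""
    return {
        "Location": sample.location,
        "Rotation": sample.rotation,
        "Lens Distortion": sample.lens_distortion,
    }

sample = TrackSample((120.0, 0.0, 45.0), (10.0, -5.0, 0.0), {"k1": -0.21})
outputs = break_track(sample)
# Only the positional data needs to travel downstream:
talent_location = outputs["Location"]
```

The same pattern applies to any field: once the track is broken, each output pin carries exactly one piece of the sample.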

### Cyclorama Connections&#x20;

Now that we have AJA nodes for video I/O, talent tracking, camera tracking, and Reality 5.4 nodes, as illustrated in the previous image, it's time to incorporate the Cyclorama.

* Add a <mark style="color:yellow;">**`Cyclorama`**</mark> node to the canvas.

#### Distorting and Undistorting

Most camera lenses exhibit some level of optical distortion, and as a result, you often need to perform undistortion and then distortion operations on your video source to address this issue.&#x20;

For instance, the image from your studio and the 3D Cyclorama mesh you've created inside Reality 5.4 may not align perfectly in terms of lens distortion. To ensure proper alignment between the Cyclorama and the camera image, it's essential to employ the undistortion process.
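To see why both operations are needed, here is a toy radial (Brown-Conrady style) distortion model in Python. The coefficients `k1` and `k2` are made-up example values; in the pipeline they come from the lens calibration data carried on the <mark style="color:green;">**`Lens Distortion`**</mark> pin. This is a generic illustration of the math, not Reality's implementation:

```python
def distort(x, y, k1=-0.21, k2=0.04):
    """Apply radial distortion to a normalized image coordinate."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(x, y, k1=-0.21, k2=0.04, iterations=10):
    """Invert the radial model. There is no closed-form inverse,
    so refine the estimate with a fixed-point iteration."""
    xu, yu = x, y
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = x / scale, y / scale
    return xu, yu

# Round trip: distorting and then undistorting recovers the point.
xd, yd = distort(0.5, 0.3)
xu, yu = undistort(xd, yd)
```

Undistorting the camera image lets it line up with the geometrically perfect Cyclorama mesh; distorting the Cyclorama's render and mask afterwards puts everything back into the camera's native (distorted) image space.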

* Create two <mark style="color:yellow;">**`Distort`**</mark> nodes and one <mark style="color:yellow;">**`Undistort`**</mark> node.

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/1czjanLv6KqnXbdfqHIW/image.png" alt=""><figcaption><p>Show as Input</p></figcaption></figure></div>

* Go to the **Properties** section of each node you created.
* Right-click the <mark style="color:red;">**`Distortion`**</mark> property for each node and select "**Show as Input**".

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/LEgWosDsEiJto5HtAOvp/image.png" alt=""><figcaption><p>Cyclorama Connections</p></figcaption></figure></div>

* Connect the <mark style="color:green;">**`Lens Distortion`**</mark> output of the <mark style="color:yellow;">**`Break Track`**</mark> node to the <mark style="color:green;">**`Distortion`**</mark> input pin of the <mark style="color:yellow;">**`Distort`**</mark> and <mark style="color:yellow;">**`Undistort`**</mark> nodes.
* Connect the <mark style="color:green;">**`SingleLink 1`**</mark> output pin of the <mark style="color:yellow;">**`AJAIn`**</mark> node to the <mark style="color:green;">**`In`**</mark> input pin of the <mark style="color:yellow;">**`Undistort`**</mark> node.&#x20;
* Connect the <mark style="color:yellow;">**`AJAIn`**</mark> node's <mark style="color:green;">**`SingleLink 1`**</mark> output to <mark style="color:green;">**`Track`**</mark> input of the <mark style="color:yellow;">**`Cyclorama`**</mark> node.
* Connect the <mark style="color:yellow;">**`Undistort`**</mark> node's <mark style="color:green;">**`Out`**</mark> output pin to <mark style="color:yellow;">**`Cyclorama`**</mark> node's <mark style="color:green;">**`Video`**</mark> input.
* Connect the <mark style="color:yellow;">**`Cyclorama`**</mark> node's <mark style="color:green;">**`Render`**</mark> output to first <mark style="color:yellow;">**`Distort`**</mark> node's <mark style="color:green;">**`In`**</mark> input pin.
* Connect the <mark style="color:yellow;">**`Cyclorama`**</mark> node's <mark style="color:green;">**`Mask`**</mark> output pin to the second <mark style="color:yellow;">**`Distort`**</mark> node's <mark style="color:green;">**`In`**</mark> input pin.

### Adding RealityKeyer

The Tracked Billboard pipeline requires a keyed image of your talent; therefore, you have to use RealityKeyer.
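In essence, a keyer compares each pixel of the live image against a reference and produces an alpha matte: transparent where the empty studio shows through, opaque where the talent stands. The toy per-pixel difference key below illustrates the concept only; it is not RealityKeyer's actual algorithm, and the threshold values are arbitrary:

```python
def difference_key(pixel, clean_plate_pixel, threshold=0.15, softness=0.1):
    """Toy difference keyer: derive alpha from the distance between the
    live pixel and the clean-plate pixel at the same position (RGB, 0..1)."""
    dist = max(abs(a - b) for a, b in zip(pixel, clean_plate_pixel))
    if dist <= threshold:
        return 0.0            # matches the empty studio: fully transparent
    if dist >= threshold + softness:
        return 1.0            # clearly foreground talent: fully opaque
    return (dist - threshold) / softness  # soft edge in between

# A background pixel matches the clean plate; a talent pixel does not.
alpha_bg = difference_key((0.1, 0.8, 0.2), (0.1, 0.8, 0.2))
alpha_fg = difference_key((0.9, 0.5, 0.4), (0.1, 0.8, 0.2))
```

This is why the distorted Cyclorama render and mask feed the keyer's Clean Plate inputs: they give the keyer an accurate reference for what the empty studio looks like through the current lens.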

<div align="left"><figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/VymFzfgjHc1TNo64s7Uf/image.png" alt=""><figcaption><p>RealityKeyer Connections</p></figcaption></figure></div>

* Create a <mark style="color:yellow;">**`RealityKeyer`**</mark> node.
* Connect the <mark style="color:green;">**`Out`**</mark> output pin of the first <mark style="color:yellow;">**`Distort`**</mark> node to the <mark style="color:green;">**`Clean Plate`**</mark> input of the <mark style="color:yellow;">**`RealityKeyer`**</mark> node.&#x20;
* Connect the <mark style="color:green;">**`Out`**</mark> output pin of the second <mark style="color:yellow;">**`Distort`**</mark> node to the <mark style="color:green;">**`Clean Plate Mask`**</mark> input of the <mark style="color:yellow;">**`RealityKeyer`**</mark> node.
* Connect the <mark style="color:green;">**`SingleLink 1`**</mark> output pin of the <mark style="color:yellow;">**`AJAIn`**</mark> node to the <mark style="color:green;">**`Input`**</mark> input pin of the <mark style="color:yellow;">**`RealityKeyer`**</mark> node.
* Create a second <mark style="color:yellow;">**`Undistort`**</mark> node, right-click on its <mark style="color:red;">**`Distortion`**</mark> property and select "**Show as Input**".
* Connect the <mark style="color:green;">**`Output`**</mark> output pin of the <mark style="color:yellow;">**`RealityKeyer`**</mark> node to the <mark style="color:green;">**`In`**</mark> input of the second <mark style="color:yellow;">**`Undistort`**</mark> node.

### Talent Location

In order to place your talent inside the Reality 5.4 scene, you have to provide talent location data via a tracking system. In our example, we utilized TRAXIS Talent Tracking.

<figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/umFpCei6yJxD7bHTu5XU/image.png" alt=""><figcaption><p>Talent Data Connections</p></figcaption></figure>

* Create a corresponding talent tracking data node. In our example, we use a <mark style="color:yellow;">**`FreeD`**</mark> node.
* Add a second <mark style="color:yellow;">**`Break Track`**</mark> node.
* Create a <mark style="color:yellow;">**`Cast vec3 to vec3d`**</mark> node.
* Connect the <mark style="color:green;">**`Location`**</mark> output pin of the second <mark style="color:yellow;">**`Break Track`**</mark> node to the <mark style="color:green;">**`vec3`**</mark> input pin of the <mark style="color:yellow;">**`Cast vec3 to vec3d`**</mark> node.
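The cast node promotes a single-precision (32-bit) vector to double precision (64-bit), which is what the talent-location input downstream expects as a `vec3d`. The difference between the two is easy to demonstrate in Python (the helper below is illustrative, using only the standard `struct` module):

```python
import struct

def to_float32(value):
    """Round-trip a Python float (64-bit) through a 32-bit float,
    the precision a vec3 component is stored at."""
    return struct.unpack("<f", struct.pack("<f", value))[0]

# A studio coordinate with sub-millimeter detail:
x = 1234.56789
x32 = to_float32(x)   # float32 cannot hold all of these digits

vec3 = (x32, to_float32(2.5), to_float32(-310.125))
# The cast itself is a lossless widening: every float32 value is
# exactly representable as a float64.
vec3d = tuple(float(c) for c in vec3)
```

Casting from `vec3` to `vec3d` never loses information; it simply matches the wider number format the receiving pin requires.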

### Engine Connections

<figure><img src="https://content.gitbook.com/content/oKRkWKIoT5UAB0ERhlgD/blobs/8CeMqtmaFctllpGTvQQI/image.png" alt=""><figcaption><p>Engine Connections</p></figcaption></figure>

* Connect the <mark style="color:green;">**`vec3d`**</mark> output pin of the <mark style="color:yellow;">**`Cast vec3 to vec3d`**</mark> node to the <mark style="color:yellow;">**`UE5`**</mark> node's <mark style="color:green;">**`Reality Tracked Billboard Talent Location`**</mark> input pin.
* Connect the second <mark style="color:yellow;">**`Undistort`**</mark> node's <mark style="color:green;">**`Out`**</mark> output pin to the <mark style="color:green;">**`Reality Tracked Billboard Video`**</mark> input of the <mark style="color:yellow;">**`UE5`**</mark> node.
* Connect the <mark style="color:yellow;">**`Xync`**</mark> node's <mark style="color:green;">**`Track`**</mark> output pin to the <mark style="color:green;">**`Reality Camera Track`**</mark> input pin of the <mark style="color:yellow;">**`UE5`**</mark> node.
* Create a <mark style="color:yellow;">**`Composite Passes`**</mark> node.&#x20;
* Connect the <mark style="color:yellow;">**`UE5`**</mark> node's <mark style="color:green;">**`Reality Camera Scene`**</mark> output pin to the <mark style="color:yellow;">**`Composite Passes`**</mark> node's <mark style="color:green;">**`Render`**</mark> input.
* Connect the <mark style="color:yellow;">**`Xync`**</mark> node's <mark style="color:green;">**`Track`**</mark> output to the <mark style="color:green;">**`Track`**</mark> input of the <mark style="color:yellow;">**`Composite Passes`**</mark> node.
* Connect the <mark style="color:green;">**`Output`**</mark> output pin of the <mark style="color:yellow;">**`RealityKeyer`**</mark> node to the <mark style="color:green;">**`Video`**</mark> input of the <mark style="color:yellow;">**`Composite Passes`**</mark> node.
* Connect the <mark style="color:green;">**`Output`**</mark> pin of the <mark style="color:yellow;">**`Composite Passes`**</mark> node to the <mark style="color:green;">**`SingleLink 4`**</mark> input pin of the <mark style="color:yellow;">**`AJAOut`**</mark> node.
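To double-check the finished graph, the connections described throughout this guide can be written down as a plain edge list and queried. This is only bookkeeping in Python, not a Reality API; node and pin names mirror the steps above, and the FreeD-to-Break-Track link is implied by the steps rather than spelled out:

```python
# (source node, source pin) -> (target node, target pin)
connections = [
    (("AJAIn", "SingleLink 1"), ("Xync", "Sync")),
    (("AJAIn", "SingleLink 1"), ("Undistort 1", "In")),
    (("AJAIn", "SingleLink 1"), ("Cyclorama", "Track")),
    (("AJAIn", "SingleLink 1"), ("RealityKeyer", "Input")),
    (("Xync", "Track"), ("Break Track 1", "Input")),
    (("Break Track 1", "Lens Distortion"), ("Undistort 1", "Distortion")),
    (("Break Track 1", "Lens Distortion"), ("Distort 1", "Distortion")),
    (("Break Track 1", "Lens Distortion"), ("Distort 2", "Distortion")),
    (("Undistort 1", "Out"), ("Cyclorama", "Video")),
    (("Cyclorama", "Render"), ("Distort 1", "In")),
    (("Cyclorama", "Mask"), ("Distort 2", "In")),
    (("Distort 1", "Out"), ("RealityKeyer", "Clean Plate")),
    (("Distort 2", "Out"), ("RealityKeyer", "Clean Plate Mask")),
    (("RealityKeyer", "Output"), ("Undistort 2", "In")),
    (("RealityKeyer", "Output"), ("Composite Passes", "Video")),
    (("FreeD", "Track"), ("Break Track 2", "Input")),
    (("Break Track 2", "Location"), ("Cast vec3 to vec3d", "vec3")),
    (("Cast vec3 to vec3d", "vec3d"),
     ("UE5", "Reality Tracked Billboard Talent Location")),
    (("Undistort 2", "Out"), ("UE5", "Reality Tracked Billboard Video")),
    (("Xync", "Track"), ("UE5", "Reality Camera Track")),
    (("UE5", "Reality Camera Scene"), ("Composite Passes", "Render")),
    (("Xync", "Track"), ("Composite Passes", "Track")),
    (("Composite Passes", "Output"), ("AJAOut", "SingleLink 4")),
]

def fed_inputs(node):
    """Input pins on `node` that have at least one incoming connection."""
    return {pin for _source, (target, pin) in connections if target == node}
```

If any set returned by `fed_inputs` is missing an expected pin, the corresponding wire was skipped somewhere in the walkthrough.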
