# Tracking Calibration and Fine Tuning

### Understanding the Axes of Reality <a href="#trackingcalibrationandfinetuning-understandingtheaxesofreality" id="trackingcalibrationandfinetuning-understandingtheaxesofreality"></a>

Different tracking devices offer different approaches to the axes system. Here you can see the axes of Reality:

<div align="left"><img src="/files/NiYkNZYcKPkl2mESklzt" alt=""></div>

On all <mark style="color:red;">**`Transform`**</mark> properties of nodes, you should see the values changing according to the axis setup shown in the image above. Keep in mind that these are the default axes; the X and Y axis values in particular need to be determined properly if you are looking from a different angle or using a different Pan value in the set.

* **Depth:** The X-axis of Reality&#x20;
* **Width:** The Y-axis of Reality
* **Height:** The Z-axis of Reality

<div align="left"><img src="/files/mPZTSns6HfeBYeFjWa0M" alt=""></div>

### Choosing the Right Tracking Method <a href="#trackingcalibrationandfinetuning-choosingtherighttrackingmethod" id="trackingcalibrationandfinetuning-choosingtherighttrackingmethod"></a>

Whether we want to achieve a Virtual Studio or an Augmented Reality studio, we need to know where our real-world camera sensor is. In addition to the position, we also need to know the pan, tilt, and roll values, which are listed below:

| Icon                             | Known As | In the Reality World |
| -------------------------------- | -------- | -------------------- |
| ![](/files/NgOAaAhCCOHFwyGORIal) | Roll     | Roll                 |
| ![](/files/hnJKOmEFf0vgCD1AvXQb) | Yaw      | Pan                  |
| ![](/files/wO52FFu9EchqbRhjjjoS) | Pitch    | Tilt                 |

* In addition to the position, pan, tilt, and roll values, we also need to know the **Field of View** and the **Distortion** of the lens. Determining all of these is called camera tracking.

There are two main methods of camera tracking:

1. Mechanical sensor tracking
2. Image tracking

Whatever the tracking method, the tracking device transfers its data over a serial port, over the network, or both.
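Regardless of transport, each frame of tracking data carries the same kind of payload. The sketch below shows a minimal, hypothetical representation of such a sample; the field names and units are illustrative assumptions, not a Reality or vendor API.

```python
# Hypothetical per-frame camera tracking sample. Field names and units
# are illustrative assumptions, not a Reality or vendor data structure.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    x: float      # position along Reality's X axis (depth)
    y: float      # position along Reality's Y axis (width)
    z: float      # position along Reality's Z axis (height)
    pan: float    # yaw, in degrees
    tilt: float   # pitch, in degrees
    roll: float   # roll, in degrees
    zoom: int     # raw zoom encoder reading (integer)
    focus: int    # raw focus encoder reading (integer)

# One frame of data as it might arrive from a tracking device:
sample = TrackingSample(x=120.0, y=-35.5, z=180.0,
                        pan=12.4, tilt=-3.1, roll=0.0,
                        zoom=41250, focus=10980)
```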

### Tracking System Lens Calibration <a href="#trackingcalibrationandfinetuning-trackingsystemlenscalibration" id="trackingcalibrationandfinetuning-trackingsystemlenscalibration"></a>

Some tracking systems can also supply the lens Field of View data within the protocol. Such tracking system vendors calibrate the lens data themselves, so Reality is not involved in that lens calibration process. If a tracking protocol doesn’t supply final lens data, Reality can instead use the zoom/focus encoder values coming through the tracking protocol and apply its own lens calibration values.

{% hint style="info" %}
Data coming from mechanical encoders are integer values. Reality tracking nodes map these raw encoder values to a floating-point value between 0 and 1.
{% endhint %}
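The normalization described in the hint above can be sketched as follows; the function name and the clamping behavior are assumptions for illustration, not Reality's implementation.

```python
def normalize_encoder(raw: int, raw_min: int, raw_max: int) -> float:
    """Map a raw integer encoder reading to a float in [0, 1].

    raw_min/raw_max are the encoder's calibrated end stops
    (e.g. full wide and full telephoto for a zoom encoder).
    """
    span = raw_max - raw_min
    if span == 0:
        return 0.0
    # Clamp so out-of-range readings never leave [0, 1].
    return min(max((raw - raw_min) / span, 0.0), 1.0)

print(normalize_encoder(25000, 0, 50000))  # → 0.5
```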

{% hint style="info" %}
If you don’t have a tracking device available during your tests, you can still use the <mark style="color:yellow;">**`UserTrack`**</mark> node for sending test tracking data to other nodes.
{% endhint %}

### Lens Center Shift <a href="#trackingcalibrationandfinetuning-lenscentershift" id="trackingcalibrationandfinetuning-lenscentershift"></a>

{% hint style="info" %}
Remember that Lens Center Shift must be applied if your tracking solution provider does not provide lens center shift calibration, or if it does but you want to override those values.
{% endhint %}

Whenever a lens is mounted onto a camera body, the center of the lens cannot be aligned perfectly with the center of the image sensor. As a result, zooming in and out shifts the image a few pixels left/right or up/down.

To compensate for this, in later steps we will find the shift amount by trial and error.

<div align="left"><img src="/files/zyEj0PG573PhwLQuq7aL" alt="Lens Node"></div>

### Calibrating Lens Center Shift <a href="#trackingcalibrationandfinetuning-calibratinglenscentershift" id="trackingcalibrationandfinetuning-calibratinglenscentershift"></a>

{% hint style="info" %}
Before you proceed, make sure that your physical camera's pan & tilt are locked.
{% endhint %}

Lens center shift occurs whenever you change the camera body or lens, and it requires calibration (offsetting) at various zoom levels.

To do that:

<div align="left"><figure><img src="/files/lABDEqbDzxYHTlu3N5Vy" alt=""><figcaption><p>Physical Cube inside the cyclorama</p></figcaption></figure></div>

* Place a physical box inside your studio. The box can be a rectangular prism, such as a cardboard box, as shown in the image above.
* Zoom your physical camera fully in and lock it, aiming at the bottom-left corner of the box.

{% hint style="info" %}
For easier reference, we recommend aiming at the bottom-left corner of the box, but any other corner will also work.
{% endhint %}

{% file src="/files/tEGtp4juqlNGnAp6SvQh" %}
HD (1920x1080) Crosshair Overlay Image&#x20;
{% endfile %}

{% file src="/files/xTnvvC89lovvALl14gEW" %}
UHD (3840x2160) Crosshair Overlay Image
{% endfile %}

* Download one of the crosshair images above in accordance with your workflow and add it to your Asset folder.
* Launch Reality Editor

<div align="left"><figure><img src="/files/pcd53sluXVqZg471PZVr" alt=""><figcaption><p>New Project Category Selection</p></figcaption></figure></div>

* Select the Virtual Studio and click Next

<div align="left"><figure><img src="/files/dYncxjFvoOhLQDtfKpU8" alt=""><figcaption><p>Template Selection Menu</p></figcaption></figure></div>

* Select the Blank project and click Next

<div align="left"><figure><img src="/files/1rQL2igU11CEKdTTbeCF" alt=""><figcaption><p>World Outliner</p></figcaption></figure></div>

* Delete everything except for the Atmospheric Fog and Light Source actors
* Switch to RealityHub
* Activate the Nodegraph/Action module

<div align="left"><figure><img src="/files/20e9tyTyzliQnafGpGYA" alt=""><figcaption><p>Composite Augmented No Shadow RGraph Template </p></figcaption></figure></div>

<figure><img src="/files/tN5HutXLeoFnaFxle0Hl" alt=""><figcaption><p>Composite Augmented No Shadow RGraph Template Nodes</p></figcaption></figure>

* Select the **Composite Augmented No Shadow** RGraph Template via Nodegraph Menu

<div align="left"><figure><img src="/files/mxyaKYCQ55R9S49YPqtq" alt=""><figcaption><p>Node tree</p></figcaption></figure></div>

* Create <mark style="color:yellow;">**`MediaInput`**</mark> and <mark style="color:yellow;">**`Mixer`**</mark> nodes
* Select the <mark style="color:yellow;">**`MediaInput`**</mark> node, expand the <mark style="color:blue;">**`File`**</mark> property group, click the folder icon of the <mark style="color:red;">**`File Path`**</mark> property, and navigate to the crosshair image via the Asset Browser
* Connect the <mark style="color:green;">**`Output`**</mark> pin of the <mark style="color:yellow;">**`MediaInput`**</mark> node to the <mark style="color:green;">**`Overlay`**</mark> input of the <mark style="color:yellow;">**`Mixer`**</mark> node
* Connect the <mark style="color:green;">**`Output`**</mark> pin of the <mark style="color:yellow;">**`CompositePasses`**</mark> node to the <mark style="color:green;">**`Channel1`**</mark> input of the <mark style="color:yellow;">**`Mixer`**</mark> node
* Connect the <mark style="color:green;">**`Program`**</mark> output pin of the <mark style="color:yellow;">**`Mixer`**</mark> node to the <mark style="color:green;">**`Video`**</mark> input of the <mark style="color:yellow;">**`AJAOut`**</mark> node
* Connect the <mark style="color:green;">**`Program`**</mark> output pin of the <mark style="color:yellow;">**`Mixer`**</mark> node to the <mark style="color:green;">**`Display`**</mark> input of the <mark style="color:yellow;">**`EngineControl`**</mark> node as shown in the image above.

<div align="left"><figure><img src="/files/ymGqGpoWXGok3v4R3L9v" alt=""><figcaption><p>Advanced Preview Monitor View</p></figcaption></figure></div>

* Activate the Advanced Preview Monitor (APM)

<div align="left"><figure><img src="/files/xZBQaaQzRATBlAMqidIE" alt=""><figcaption><p>Dummy Cube &#x26; Physical Cube</p></figcaption></figure></div>

* Add a <mark style="color:yellow;">**`DummyCube`**</mark> actor to the nodegraph canvas and position it right next to the physical box, as shown in the image above.

<div align="left"><figure><img src="/files/w260TcbDtX526MXZvQh8" alt=""><figcaption><p><mark style="color:yellow;"><strong><code>Lens</code></strong></mark> Node </p></figcaption></figure></div>

* Fully zoom out with the physical camera. If the center shifts, the <mark style="color:yellow;">**`DummyCube`**</mark> will drift away. In that case, adjust the <mark style="color:red;">**`Center Shift X`**</mark> & <mark style="color:red;">**`Center Shift Y`**</mark> properties of the <mark style="color:yellow;">**`Lens`**</mark> node.
* Zoom in and zoom out while iterating the shift values until the shift in <mark style="color:yellow;">**`DummyCube`**</mark>'s position is very small.
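The trial-and-error loop in the steps above can be sketched as follows. Everything here is a hypothetical illustration of the operator's procedure: `measure_drift_px` stands in for eyeballing the `DummyCube` drift between full wide and full zoom, and `set_center_shift` stands in for entering the values on the `Lens` node; neither is a Reality API, and the sign convention of the nudge depends on your setup.

```python
def calibrate_center_shift(measure_drift_px, set_center_shift,
                           step=0.5, tolerance=1.0, max_iters=50):
    """Iteratively nudge center shift until the observed drift is small.

    measure_drift_px(sx, sy) -> (dx, dy): drift of the reference point
    between full wide and full zoom, given candidate shift values.
    """
    shift_x, shift_y = 0.0, 0.0
    for _ in range(max_iters):
        dx, dy = measure_drift_px(shift_x, shift_y)
        if abs(dx) <= tolerance and abs(dy) <= tolerance:
            break  # drift is negligible on both axes
        # Nudge each out-of-tolerance axis toward reducing its drift
        # (sign convention is an assumption; flip it if drift grows).
        if abs(dx) > tolerance:
            shift_x += step if dx > 0 else -step
        if abs(dy) > tolerance:
            shift_y += step if dy > 0 else -step
    set_center_shift(shift_x, shift_y)
    return shift_x, shift_y
```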

### Setting the Right Tracking Parameters <a href="#trackingcalibrationandfinetuning-settingtherighttrackingparameters" id="trackingcalibrationandfinetuning-settingtherighttrackingparameters"></a>

Setting the right tracking parameters will prevent sliding between real ground and virtual ground. To achieve this, follow the steps below:

* If your tracking system does not send Focal Distance information, you can optionally add a <mark style="color:yellow;">**`Lens`**</mark> node to your nodegraph canvas to read the Focal Distance information from a lens file.

The <mark style="color:yellow;">**`Lens`**</mark> node should look like this:&#x20;

<div align="left"><figure><img src="/files/C6PcEnQmTdKayQDvXN9P" alt=""><figcaption></figcaption></figure></div>

* If you have a <mark style="color:yellow;">**`Lens`**</mark> node in your nodegraph and the <mark style="color:red;">**`Lens`**</mark> list does not include the lens you are using, please choose the <mark style="color:red;">**`Lens`**</mark> which has the closest <mark style="color:red;">**`Focal Distance`**</mark> to the lens you have:

<div align="left"><img src="/files/tN18Qbyz8H0RS2GEAmtz" alt=""></div>

* **(FOR SYSTEMS WITH STYPE)** Go to <mark style="color:yellow;">**`Stype`**</mark> node on the nodegraph and make sure that <mark style="color:red;">**`Nodal Offset`**</mark> is 0 as shown below:

<div align="left"><img src="/files/a9Q6ThPqaTWbyETTS8Hb" alt=""></div>

{% hint style="info" %}
If the lens has been removed and remounted, meaning the physical location of the lens has changed, the Lens Center Shift calibration explained above must be done again.
{% endhint %}

{% hint style="warning" %}
If you are using a PTZ head, you might need to modify the <mark style="color:red;">**`Device Transform`**</mark> of the tracking node according to the axes of Reality. Consult your tracking solution provider for the measurements and transform values related to the zero pivot point of the camera, and enter these values into <mark style="color:red;">**`Device Transform`**</mark> as shown below. Remember that the accuracy of the PTZ values, which are RYP values in Reality, is particularly important for seamless calibration.
{% endhint %}

<div align="left"><img src="/files/sYpkgUjF073ZtLOD13Ei" alt=""></div>

In Reality Editor, check if the floor offset is higher than zero (0).

<div align="left"><img src="/files/HHhTwXYlETrEPnAkyRmv" alt=""></div>

For example, in this project the floor of the set is 30 cm above the Z axis origin. In this case, the <mark style="color:red;">**`Bottom`**</mark> of the <mark style="color:yellow;">**`Projection Cube`**</mark> in the setup won’t be visible, as shown below:

<div align="left"><img src="/files/1PYfHdZek3b0jleBQThX" alt=""></div>

The Bottom of the Projection Cube is not visible when it is set to 0.

<div align="left"><img src="/files/iStuAqdcEDBtGJbe9jJN" alt=""></div>

The Bottom of the Projection Cube conflicts with the graphics when it is set to 30 and is fully visible when it is set to 31. This means the graphics are approximately 30 cm above the Z axis. The right way to make this setup work is to move the graphics 31 cm down by changing the <mark style="color:red;">**`Local Transform`**</mark> of our <mark style="color:yellow;">**`Custom Actor`**</mark> by 31 cm.

<div align="left"><img src="/files/YBKZSKt3781Wg9uGhHqO" alt=""></div>

Set the <mark style="color:red;">**`Z`**</mark> transform parameter of the Custom Actors accordingly. After setting these parameters correctly, the project's tracking should work fine. Remember that every <mark style="color:yellow;">**`Custom Actor`**</mark> connected to the <mark style="color:yellow;">**`Track Modifier`**</mark> must have the same <mark style="color:red;">**`Z`**</mark> transform value. Don’t forget to recheck these parameters if any <mark style="color:yellow;">**`Custom Actor`**</mark> or <mark style="color:yellow;">**`Track Modifier`**</mark> is changed.
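The floor-offset arithmetic above can be written out as a small worked example. The actor names and the dictionary shape are hypothetical, chosen only to illustrate applying the same Z correction to every Custom Actor connected to the Track Modifier.

```python
# The Projection Cube's Bottom was raised from 0 until the cube became
# fully visible at 31, so the graphics sit ~31 cm above the Z origin.
floor_offset_cm = 31.0

# Hypothetical Custom Actors connected to the Track Modifier; all of
# them must receive the same Z correction on their Local Transform.
actor_z_transforms = {"SetFloor": 0.0, "DeskProps": 0.0}
corrected = {name: z - floor_offset_cm
             for name, z in actor_z_transforms.items()}
print(corrected)  # → {'SetFloor': -31.0, 'DeskProps': -31.0}
```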

### Mapping Pan/Tilt/Roll Coordinates <a href="#trackingcalibrationandfinetuning-mappingpan-tilt-rollcoordinates" id="trackingcalibrationandfinetuning-mappingpan-tilt-rollcoordinates"></a>

Your raw tracking data may arrive with different coordinate assignments for Pan, Tilt, and Roll. For example, Reality may receive Pan data where Tilt is expected, and vice versa, on the tracking node. To swap these coordinates, all tracking nodes in the Reality Control application (such as FreeD, MoSys, Stype, UserTrack, etc.) provide a <mark style="color:red;">**`Pan/Tilt/Roll`**</mark> property under the <mark style="color:blue;">**`Transform Mapping`**</mark> property group, where you can choose among various mapping methods.

<div align="left"><img src="/files/R0KvS1jhppFyfkE1LQuD" alt=""></div>

All 6 possible permutations are available for mapping, matching however your tracking data is sent to the Reality Engine. In the example below, you can see the behavior of **PTR** and **TPR** mapping: the Pan and Tilt coordinates are swapped.

<div align="left"><img src="/files/hREk7QvKiGsM7sOTbe2k" alt=""></div>

Terminology of the coordinates for Pan/Tilt/Roll:

| Coordinate | Reality Property | Editor Terminology |
| ---------- | ---------------- | ------------------ |
| P (Pan)    | Y                | Yaw                |
| T (Tilt)   | P                | Pitch              |
| R (Roll)   | R                | Roll               |
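The permutation behavior described above can be sketched as a small function. The mapping strings mirror the `Transform Mapping` options, but the function itself is an illustrative assumption, not Reality's implementation.

```python
def map_ptr(pan: float, tilt: float, roll: float, mapping: str = "PTR"):
    """Reorder incoming (pan, tilt, roll) per a 3-letter mapping string.

    The i-th output slot (pan, tilt, roll) takes the incoming axis
    named at mapping[i], so "TPR" swaps Pan and Tilt.
    """
    incoming = {"P": pan, "T": tilt, "R": roll}
    out_pan, out_tilt, out_roll = (incoming[axis]
                                   for axis in mapping.upper())
    return out_pan, out_tilt, out_roll

# TPR swaps Pan and Tilt, as in the example above:
print(map_ptr(10.0, 20.0, 5.0, "TPR"))  # → (20.0, 10.0, 5.0)
```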

