Raw Input

Raw data is sensor data before any image processing, also called a Bayer pattern/image. The benefit of raw data in Reality is that the uncompressed data captured from the camera is de-Bayered inside Reality, yielding high-quality video with a superior signal for achieving a good-quality key.

In a single-sensor camera, color is produced by filtering each photosite (or pixel) so that it records a single red, green, or blue value; the filter arrangement most often used is the Bayer pattern. Because each pixel contains only one color value, raw isn't viewable on a monitor in any discernible way. In other words, raw isn't video.

Raw has to be converted to video for viewing and use. This is usually done through a de-Bayer process, which determines both the color and brightness of each finished pixel in the image. The upside of raw video is that no typical video processing has been baked in: the sensor outputs exactly what it sees, with no white balance, ISO, or other color adjustments applied. This, together with a high bit depth, allows a huge amount of adjustment in Reality.
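To make "de-Bayering" concrete, here is a minimal sketch of bilinear demosaicing for an RGGB mosaic in Python/NumPy. This is an illustration only, not Reality's actual de-Bayer algorithm, and the function name is hypothetical:

```python
# A minimal sketch of bilinear de-Bayering, assuming an RGGB layout.
# NOT Reality's algorithm -- just shows how a single-channel Bayer
# mosaic becomes a three-channel RGB image.
import numpy as np

def debayer_bilinear(raw: np.ndarray) -> np.ndarray:
    """raw: (H, W) Bayer mosaic, RGGB layout. Returns (H, W, 3) RGB."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), dtype=np.float32)

    # Scatter each photosite into its own color plane.
    rgb[0::2, 0::2, 0] = raw[0::2, 0::2]   # R
    rgb[0::2, 1::2, 1] = raw[0::2, 1::2]   # G (on R rows)
    rgb[1::2, 0::2, 1] = raw[1::2, 0::2]   # G (on B rows)
    rgb[1::2, 1::2, 2] = raw[1::2, 1::2]   # B

    # Fill the holes in each plane by averaging the nearest samples.
    kernel_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], np.float32) / 4
    kernel_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], np.float32) / 4
    for c, k in ((0, kernel_rb), (1, kernel_g), (2, kernel_rb)):
        plane = np.pad(rgb[:, :, c], 1, mode="reflect")
        acc = np.zeros((h, w), np.float32)
        for dy in range(3):
            for dx in range(3):
                acc += k[dy, dx] * plane[dy:dy + h, dx:dx + w]
        rgb[:, :, c] = acc
    return rgb

# Example: demosaic a synthetic 4x4 mosaic.
mosaic = np.arange(16, dtype=np.float32).reshape(4, 4)
print(debayer_bilinear(mosaic).shape)  # (4, 4, 3)
```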

Working with Raw Video Input in Reality

Before we begin, note that not all cameras provide raw data. Only certain cameras on the market deliver raw output; one of them is the Panasonic VariCam. Reality now supports the VariCam's raw data in live video production, making the key look better than ever.

To set up raw input in Reality, follow the steps below:

Creating Basic Raw Input and Output Pipeline

Launch the project in which you intend to work with raw input.

In the Setup tab, add the raw data processing nodes listed below to the node graph, in the same order (1, 2, and 3).

  1. Delog node. Go to Experimental > Varicam

  2. DebayerSource node. Go to Experimental

  3. Matrix-Drangescale node. Go to Experimental > Varicam

These three nodes perform all the image processing and de-Bayer/demosaic the raw input data.
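For context, here is a conceptual sketch of what the Delog and Matrix-Drangescale stages do, assuming Panasonic's published V-Log transfer function. The constants come from Panasonic's V-Log/V-Gamut reference and should be verified against the current spec; the 3x3 matrix is an identity placeholder, and the function names are hypothetical, not Reality's internals:

```python
# Conceptual sketch of the stages around the de-Bayer step.
# Delog linearizes the V-Log signal; Matrix-Drangescale applies a
# 3x3 color matrix and rescales the dynamic range.
import numpy as np

# Published V-Log constants (Panasonic V-Log/V-Gamut reference);
# verify against the current spec before relying on them.
B, C, D, CUT2 = 0.00873, 0.241514, 0.598206, 0.181

def vlog_to_linear(v: np.ndarray) -> np.ndarray:
    """Invert the V-Log curve: code values -> linear reflectance."""
    return np.where(v < CUT2,
                    (v - 0.125) / 5.6,
                    np.power(10.0, (v - D) / C) - B)

def matrix_drangescale(rgb: np.ndarray,
                       m: np.ndarray | None = None,
                       scale: float = 1.0) -> np.ndarray:
    """Apply a 3x3 color matrix (identity placeholder), then rescale."""
    if m is None:
        m = np.eye(3, dtype=np.float32)
    return (rgb @ m.T) * scale

# In Reality's graph, Delog runs on the mosaic before DebayerSource;
# since both operations here are elementwise, the order is the same.
frame = np.random.rand(4, 4, 3).astype(np.float32)  # de-Bayered RGB
out = matrix_drangescale(vlog_to_linear(frame))
```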

For video input, add an AJACard node and then an AJAInRGB node. Connect the AJACard node's Device output pin to the AJAInRGB node's Device input pin.

For raw input, a new AJAInRGB node must be used; go to Experimental > Media I/O. This is the point where raw data from the camera is accessed.

Now add the three new nodes (in the order shown above), which will do all the image processing and de-Bayer/demosaic the raw input data.

Connect the AJAInRGB output to the VaricamDelog node's Input pin.

Then connect the VaricamDelog Output to the DebayerSource Input pin.

Then connect the DebayerSource Output to the Matrix-Drangescale Input pin, and finally the Matrix-Drangescale Output pin to the Mixer node, or to any other node in the compositing pipeline, depending on what you would like to compose.
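Putting these connections together, the basic raw pipeline is: AJACard (Device) → AJAInRGB → VaricamDelog → DebayerSource → Matrix-Drangescale → Mixer (or other compositing nodes).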

The output from Matrix-Drangescale finally gives the processed, de-Bayered/demosaicked image, which can be viewed on any SDI or HDMI monitor just like any other non-raw video source.

The image below shows the raw data processing nodes from input to output. In other words, you can treat the video coming from the Matrix-Drangescale Output pin just as you would video coming straight from the AJA input; that output then feeds the input pins of other nodes.

The image below shows the raw data processing nodes from input to output together with other nodes; in this example, the result is composited with a Cyclorama for keying.