Display Calibration

Calibration is the process of aligning WATCHOUT's rendered output to the physical reality of the display surface. This encompasses projector alignment for 3D mapping, camera-based calibration via NDI streams, EDID management for display identification, and external calibration integration through the HTTP API.

NDI Calibration Stream

For camera-based calibration workflows, each GPU display can be assigned an NDI Calibration Stream in the Calibration section of Device Properties. This setting specifies the name of an NDI video stream that carries a live camera feed of the display surface.

When configured, the Runner can receive the NDI stream and use it as a reference input for alignment. This is typically used in automated or semi-automated calibration systems where a camera observes the projected output and provides feedback for geometric correction.

To set up an NDI calibration stream:

  1. Select the display in Device Properties.
  2. Open the Calibration section.
  3. Enter the NDI stream name in the Calibration Stream field.

The stream name should match the NDI source name exactly as it appears on the network. The calibration system uses this stream to compare the expected output pattern with the observed result on the physical surface.
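As an illustration of the comparison idea, the sketch below computes a crude alignment error between an expected pattern and an observed camera frame, both given as NumPy arrays. The metric and the function name are assumptions for illustration only; the comparison WATCHOUT actually performs is internal.

```python
import numpy as np

def alignment_error(expected: np.ndarray, observed: np.ndarray) -> float:
    """Mean absolute per-pixel difference between the pattern the system
    expects to see and the frame captured from the calibration camera,
    after normalizing both images to the [0, 1] range.
    (Illustrative metric only, not WATCHOUT's internal comparison.)"""
    exp = expected.astype(np.float64)
    obs = observed.astype(np.float64)
    exp = (exp - exp.min()) / max(np.ptp(exp), 1e-9)
    obs = (obs - obs.min()) / max(np.ptp(obs), 1e-9)
    return float(np.mean(np.abs(exp - obs)))
```

A perfectly matching frame yields 0.0, a fully inverted one yields 1.0; in practice a real pipeline would first warp the camera frame into the projector's coordinate space before comparing.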

Projector Calibration (3D Mapping)

For 3D projector displays, WATCHOUT provides a dedicated calibration system that uses point correspondences to compute the projector's position, orientation, and lens parameters. This is essential for accurate projection mapping where content must align precisely to a physical 3D model.

[Figure: the calibration view, showing virtual points placed on the 3D model, reality points (dragged to calibrate), the correspondence lines between them, and the current Error readout.]

End-to-End 3D Mapping Workflow

The full 3D mapping process follows four steps:

  1. Position real — set up your projector(s) and the object(s) you want to project on in the real world.
  2. Prepare virtual — create a matching virtual scene in Producer by adding and positioning 3D models and projectors to match the physical setup.
  3. Calibrate — align the virtual scene to the real scene by placing and matching calibration points on the 3D model and the physical surface.
  4. Play media — project media onto the surface of the object(s).

The 3D model must closely match the real object's proportions. If the model and the physical object differ significantly, calibration will be very difficult or impossible.

Calibration UI Reference

The calibration toolbar provides the following controls:

  1. View / Calibration switch — toggle between View mode (projector can be moved freely) and Calibration mode (calibration point editing enabled). The projector may only be moved in View mode, for instance using W/A/S/D or the arrow keys.
  2. Reset — reset the calibration to its initial state.
  3. Edit virtual points — activate virtual-point editing on the 3D model surface.
  4. Edit reality points — activate reality-point editing on the projector output.
  5. Add — add a new calibration point.
  6. Move — move an existing calibration point.
  7. Remove — remove a calibration point.
  8. Vertex snapping — enable/disable snapping to nearby vertices on the 3D model when adding or moving virtual points.
  9. Link / Unlink projector-to-camera — when linked, the camera follows the projector's point of view. Unlink to navigate freely without moving the projector. When relinked, the camera moves to the projector's POV (not the other way around).
  10. Manual mode — calibration is only recalculated when the Calibrate button is pressed.
  11. Continuous mode — calibration recalculates automatically every time a reality point is moved.
  12. Velocity control — adjust the movement speed for reality-point editing.
  13. Reposition — move all reality points on top of their corresponding virtual points.
  14. Calibrate — trigger the calibration computation.
  15. Accuracy — displays how well the calibration solution aligns virtual and reality points; 100% indicates perfect alignment.
  16. Exit — exit projector/calibration mode.

Virtual Points and Reality Points

The calibration process works with two sets of points:

  • Virtual points are placed on the 3D model in the Stage view. They represent known locations on the surface where you want the projected image to land. Virtual points are defined in world coordinates (X, Y, Z).
  • Reality points are the corresponding positions on the projector's 2D output where those virtual-point locations actually appear when projected. Reality points are defined in normalized screen coordinates.

The calibration algorithm uses these point pairs to solve for the projector's intrinsic parameters (focal length, lens shift) and extrinsic parameters (position, orientation) using a camera calibration model.
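The structure of such a solve can be sketched with a Direct Linear Transform: a 3x4 projection matrix has 11 degrees of freedom and each point pair contributes two linear equations, which is also why at least six pairs are required. This is a minimal sketch of the standard technique, not WATCHOUT's actual solver (which additionally refines lens parameters):

```python
import numpy as np

def solve_projection_dlt(world_pts, image_pts):
    """Solve the 3x4 projection matrix P mapping 3D world points to 2D
    image points via the Direct Linear Transform. P has 12 entries but
    only 11 degrees of freedom (it is defined up to scale), and each
    correspondence yields 2 equations, hence the six-point minimum."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector belonging to the
    # smallest singular value of A (the null space of A, up to noise).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def reproject(P, world_pts):
    """Project world points through P and return (u, v) coordinates."""
    X = np.column_stack([np.asarray(world_pts, dtype=float),
                         np.ones(len(world_pts))])
    x = X @ P.T
    return x[:, :2] / x[:, 2:3]
```

The recovered P can then be decomposed into intrinsic and extrinsic parameters; the reprojection error of `reproject(P, world_pts)` against the measured points is the same kind of residual the accuracy indicator reflects.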

Calibration Workflow

  1. Switch to Calibration mode — in the Stage toolbar, switch the projector view from View mode to Calibration mode. This enables the calibration point editing tools.
  2. Place virtual points — using the point tools in the Stage view, add virtual points on the 3D model at clearly identifiable surface locations (corners, edges, landmarks). You can add, move, and remove points using the toolbar actions.
  3. Place at least six points — the calibration algorithm requires a minimum of six virtual points before you can edit reality points. This minimum ensures the system has enough constraints to solve for all projector parameters. More points can be added to further constrain the mapping.

Note: you must create at least six virtual points before you can edit the reality points.

Note: more points do not necessarily give a better result. Adding poorly placed or inaccurate points can degrade calibration quality. Focus on accurate placement of well-distributed points rather than maximizing count.

  4. Edit reality points — once six or more virtual points exist, switch to reality-point editing. For each virtual point, adjust the corresponding reality point to match where that location actually appears on the projector's output. The Stage view shows both sets of points for comparison.

Select a reality point by left-clicking on it (or close to it). You can also navigate between reality points using W/A/S/D keys or the left stick on a compatible gamepad (Xbox controller). Once selected, move the point using the arrow keys or the right stick of a gamepad. The goal is to make each reality point project onto the same physical location as its corresponding virtual point.

Gamepad support: a compatible gamepad (such as an Xbox controller) can be used to select and move calibration points. Use the left stick to cycle through points and the right stick to adjust the selected point's position. See Xbox Controller for details.

  5. Calibrate — trigger the calibration computation. The system solves for the projector parameters that best align the virtual and reality point pairs.

Continuous vs. Manual Calibration

The calibration toolbar provides two calibration behaviors:

  • Continuous calibration — the system recalculates the projector parameters automatically every time you move a point. This provides real-time feedback as you adjust reality points, making it easier to converge on an accurate alignment.

Only use Continuous mode for fine-tuning. If some reality points are still far from their correct locations, Continuous mode can cause severe jumps in the projector position, making calibration difficult to complete. Start with Manual mode to get an approximate alignment, then switch to Continuous for final adjustments.

  • Manual calibration — the system only recalculates when you explicitly press the Calibrate button. Use this when you want to adjust multiple points before triggering a recalculation, or when continuous recalculation is distracting.

Calibration Accuracy

After calibration, WATCHOUT displays an accuracy indicator showing how well the computed projector model aligns the virtual and reality points, together with a reprojection error value. A low error indicates good alignment. An error above 100 indicates a significant problem, typically caused by incorrect point placement, too few points, or a physical setup that does not match the 3D model.

If the accuracy is poor, review the point placements and check for:

  • Points that are nearly coplanar (insufficient 3D variation)
  • Incorrectly matched virtual/reality pairs
  • Physical obstructions or distortions not captured in the model
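The coplanarity pitfall in the first bullet can be checked numerically. The sketch below is a heuristic of the author's, not an official WATCHOUT diagnostic: the smallest-to-largest singular value ratio of the centered virtual-point cloud drops toward zero as the points approach a single plane.

```python
import numpy as np

def coplanarity_ratio(virtual_points) -> float:
    """Ratio of the smallest to the largest singular value of the
    centered virtual-point cloud. Values near zero mean the points are
    nearly coplanar, which leaves the projector solve under-constrained.
    (Heuristic check, not an official WATCHOUT diagnostic.)"""
    P = np.asarray(virtual_points, dtype=float)
    P = P - P.mean(axis=0)            # center the cloud
    s = np.linalg.svd(P, compute_uv=False)
    return float(s[-1] / s[0]) if s[0] > 0 else 0.0
```

Ratios below roughly 0.05 suggest adding points with more depth variation; the exact threshold is a rule of thumb.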

Reposition Action

The Reposition action moves all reality points to sit directly on top of their corresponding virtual points in the current projector view. This is useful as a reset or starting point before manual fine-tuning — it gives you a clean baseline where both point sets overlap, and you can then adjust individual reality points to account for real-world discrepancies.
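The effect can be sketched as projecting each virtual point through the current projector matrix and writing the result back as the reality point. The pinhole matrix P and the explicit output resolution below are assumptions for illustration, not WATCHOUT's internal representation:

```python
import numpy as np

def reposition(P, world_pts, width, height):
    """Compute reality points that sit exactly on top of their virtual
    points: project each virtual point through the current 3x4 projector
    matrix P, then convert pixels to the normalized [0, 1] screen
    coordinates in which reality points are defined."""
    X = np.column_stack([np.asarray(world_pts, dtype=float),
                         np.ones(len(world_pts))])
    x = X @ np.asarray(P, dtype=float).T
    uv = x[:, :2] / x[:, 2:3]              # pixel coordinates
    return uv / np.array([width, height])  # normalized reality points
```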

Projector Parameter Locking

During calibration, you can lock specific projector parameters to prevent the calibration algorithm from changing them:

  • Lock Lens Shift — prevents the calibration from adjusting the horizontal and vertical lens shift values. Use this when you know the lens shift setting from the projector's specification sheet and want to preserve it.
  • Lock Width / Distance Ratio — prevents the calibration from adjusting the throw ratio. Use this when the throw ratio is precisely known from the lens data.

Locking parameters reduces the degrees of freedom in the calibration solve, which can improve accuracy when the locked values are known to be correct, but can also degrade results if the locked values are wrong.

External Calibration Triggers

WATCHOUT supports an external calibration trigger mechanism that allows third-party calibration systems to put displays into calibration mode via the HTTP API. This is used in automated calibration workflows where an external system (such as VIOSO or other camera-based alignment tools) needs WATCHOUT to display calibration patterns while the external system captures and processes the result.

The trigger works through the Operative's HTTP input endpoint:

  • Endpoint: POST /v0/inputs on the Operative's external port
  • Input key: displaycalibration
  • Value: 1.0 to enter calibration mode, 0.0 to exit

A typical automated calibration sequence:

  1. The external system sends displaycalibration = 1.0 to put WATCHOUT displays into calibration mode.
  2. The external system runs its calibration process (projecting patterns, capturing camera images, computing corrections).
  3. The external system copies the resulting calibration data (e.g., MPCDI files) to the expected location.
  4. The external system sends displaycalibration = 0.0 to return WATCHOUT to normal operation.
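The trigger can be scripted with nothing but the standard library. The endpoint path and input key come from the list above, but the JSON body shape used here is an assumption; consult the WATCHOUT HTTP API reference for the exact request schema.

```python
import json
import urllib.request

def set_display_calibration(host: str, port: int, active: bool) -> None:
    """Send the external calibration trigger to the Operative's HTTP
    input endpoint. NOTE: the JSON body shape is an assumption for
    illustration; check the WATCHOUT HTTP API reference for the exact
    schema."""
    body = json.dumps({"inputs": [
        {"key": "displaycalibration", "value": 1.0 if active else 0.0},
    ]}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}:{port}/v0/inputs",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()

# Typical sequence from the external system's side (HOST and PORT are
# placeholders for the Operative's address and external port):
#   set_display_calibration(HOST, PORT, True)    # enter calibration mode
#   ...run external calibration, copy MPCDI files into place...
#   set_display_calibration(HOST, PORT, False)   # back to normal operation
```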

This integration supports hardware trigger devices (such as Elgato Stream Deck) for operator-initiated recalibration in permanent installations.

Tip: Save a snapshot of the show file after successful calibration so you can revert if subsequent edits introduce problems. Treat calibration state as critical show data.