Expanding the horizons of visual production
Designed for versatility, our first camera tracking system is easily set up and adapted to almost any shooting situation. A multi-eye image sensor unit works with a processing box and lens encoder to integrate tracking data with camera and lens metadata, streamlining virtual production, AR, on-set preview, and post-production workflows.
Every element has been designed to work with a range of cameras, lenses and rigs, delivering highly accurate camera tracking, even outdoors or when working around obstacles.
Virtual Production
Perfectly suited to working with LED displays and green screens, marker-free tracking eliminates the need to set up tracking cameras or IR markers, avoiding unwanted reflections and reducing corrections in post.
Augmented Reality
Give viewers an immersive visual experience, with AR overlays applied in real time. Simply move the system to transform any studio into a virtual studio, and deliver everything from stats reviews to graphical displays—without relocating equipment or applying new markers.
On-set Preview
On-set preview provides immediate review of composited footage, whether on set or outdoors. Adjustments can be made in near real time, while tracking data can be recorded as FBX files onto a UHS-II/UHS-I SDXC memory card to streamline matching and finishing in post.
Post-Production
Record tracking information and lens data during shooting, along with camera motion data obtained through match-moving, to support post-production editing.
Sensor Unit
Our compact, lightweight Sensor Unit can be mounted in any position or orientation, thanks to the multi-eye image sensor. Attach it via the included NATO rail and adjust its position without tools. Synchronization across multi-camera productions is simple too, thanks to the camera metadata the system gathers.
- Dimensions*: approx. W 86 mm × H 60 mm × D 43 mm
- Weight*: approx. 250 g
* Final specifications may differ.
Processing Box
With a single SDI cable, metadata and sync information can be acquired from the camera*. Screw holes on the front and rear surfaces allow for flexible mounting, while an OLED display makes it easy to check the status of camera tracking, lens data, and other information. The Processing Box is equipped with genlock and timecode inputs, SDI input/output connectors, and a lens encoder connector.
* When using a Sony camera that can retrieve lens information.
Lens Encoder
If lens metadata is not available via the camera’s SDI output, the Lens Encoder can capture it directly from the lens itself. A rotary mechanism detects the rotation angles of the zoom, focus, and iris rings, and the Lens Encoder passes this data to the Processing Box via a LEMO 7-pin cable.
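As an illustration only, here is a minimal Python sketch of how ring angles reported by a lens encoder could be mapped to calibrated lens values with simple interpolation. The calibration table, values, and function names are hypothetical and not part of the OCELLUS specification.

```python
# Hypothetical sketch: mapping a focus-ring rotation angle to a calibrated
# focus distance via linear interpolation over a small lookup table.
from bisect import bisect_left

# Placeholder (ring angle in degrees, focus distance in metres) calibration pairs.
FOCUS_CALIBRATION = [(0.0, 0.45), (90.0, 1.2), (180.0, 3.0), (270.0, 10.0)]

def focus_distance_from_angle(angle_deg: float) -> float:
    """Linearly interpolate a focus distance from the focus-ring angle."""
    angles = [a for a, _ in FOCUS_CALIBRATION]
    values = [v for _, v in FOCUS_CALIBRATION]
    if angle_deg <= angles[0]:
        return values[0]
    if angle_deg >= angles[-1]:
        return values[-1]
    i = bisect_left(angles, angle_deg)
    a0, a1 = angles[i - 1], angles[i]
    v0, v1 = values[i - 1], values[i]
    t = (angle_deg - a0) / (a1 - a0)
    return v0 + t * (v1 - v0)

print(focus_distance_from_angle(135.0))  # 2.1 m with the placeholder table
```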
Multi-Eye Image Sensor
With sensors on five faces*, the Sensor Unit can capture multiple feature points at once across a wide field of view. This allows for exceptionally stable performance: as long as at least one active image sensor captures valid feature points, tracking data can be extracted.
* Four out of the five sensors are selected for use.
Marker-free Tracking
Setup is simple and fast, as there is no need to set up IR markers or stationary cameras. The Sensor Unit recognizes natural feature points and uses Visual SLAM (Simultaneous Localization and Mapping) technology to capture location data, whether indoors or outdoors.
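For readers curious what “feature points” means in practice, the short Python sketch below detects natural feature points in a single frame using OpenCV’s ORB detector. It is purely conceptual background on the kind of input a Visual SLAM pipeline tracks from frame to frame, and says nothing about the actual algorithm inside the Sensor Unit; the image path is a placeholder.

```python
# Conceptual illustration only: detect natural feature points in one frame,
# the raw material a Visual SLAM pipeline matches against its map to estimate pose.
import cv2

frame = cv2.imread("studio_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
if frame is None:
    raise FileNotFoundError("replace 'studio_frame.png' with a real image")

orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(frame, None)

print(f"Detected {len(keypoints)} candidate feature points")
# In a SLAM pipeline these points would be matched against the map built during
# map creation, and the camera pose solved from the resulting correspondences.
```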
IR LED
The Sensor Unit is equipped with infrared (IR) LEDs on five faces, positioned on both sides of each image sensor, providing auxiliary IR illumination so feature points can be detected reliably even in very dark conditions.
Visible Light Cut Unit
For venues with changeable lighting conditions, such as concerts, a Visible Light Cut Unit can be attached to block light other than infrared, ensuring feature points can still be recognized.
Real-time Graphics
Graphics-based video compositing can happen in real time. The Processing Box sends tracking, camera, and lens metadata to CG rendering software such as Unreal Engine via an Ethernet cable*, in the free-d format (a minimal receiving sketch follows the note below).
* 1000BASE-T, 100BASE-TX, 10BASE-T
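For context, this sketch shows how a receiver might decode free-d pose messages over UDP in Python, assuming the widely used 29-byte “Type D1” layout (pan, tilt, roll, position, zoom, focus). The port number is a placeholder and the transport details of a given setup are determined by the rendering software’s configuration; this is not an OCELLUS-specific implementation.

```python
# Minimal free-d listener sketch, assuming the common 29-byte "D1" message layout.
import socket

PORT = 40000  # placeholder port, not an OCELLUS-specified value

def signed24(b: bytes) -> int:
    """Decode a big-endian signed 24-bit integer."""
    value = int.from_bytes(b, "big", signed=False)
    return value - (1 << 24) if value & 0x800000 else value

def parse_d1(packet: bytes) -> dict:
    """Decode one 29-byte free-d D1 message into pose and lens values."""
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a free-d D1 message")
    return {
        "camera_id": packet[1],
        "pan_deg":  signed24(packet[2:5])  / 32768.0,
        "tilt_deg": signed24(packet[5:8])  / 32768.0,
        "roll_deg": signed24(packet[8:11]) / 32768.0,
        "x_mm": signed24(packet[11:14]) / 64.0,
        "y_mm": signed24(packet[14:17]) / 64.0,
        "z_mm": signed24(packet[17:20]) / 64.0,
        "zoom":  int.from_bytes(packet[20:23], "big"),
        "focus": int.from_bytes(packet[23:26], "big"),
        # byte 28 carries a checksum, ignored here for brevity
    }

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.bind(("", PORT))
    while True:
        data, _addr = sock.recvfrom(64)
        print(parse_d1(data))
```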
FBX File Recording
Record tracking data as FBX files on a UHS-II/UHS-I SDXC memory card in the Processing Box. The timecode and file name of the main camera are captured at the same time, making it easy to match files with the main footage during post. Recording can also be activated by the Rec Trigger found on many Sony cameras.
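As a hypothetical example of that conform step, the Python sketch below pairs recorded FBX files with camera clips that share a base file name. The folder layout, file extensions, and naming convention are assumptions made for illustration only.

```python
# Hypothetical sketch: match recorded tracking FBX files to camera clips by base name.
from pathlib import Path

def match_tracking_to_clips(fbx_dir: str, clip_dir: str) -> dict[str, Path]:
    """Map each camera clip to the FBX tracking file with the same base name."""
    tracking = {p.stem: p for p in Path(fbx_dir).glob("*.fbx")}
    matches = {}
    for clip in Path(clip_dir).glob("*.mxf"):  # placeholder clip extension
        if clip.stem in tracking:
            matches[clip.name] = tracking[clip.stem]
    return matches

# Placeholder folder names for illustration.
for clip_name, fbx_path in match_tracking_to_clips("card/TRACKING", "card/CLIPS").items():
    print(f"{clip_name} -> {fbx_path.name}")
```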
Other useful tools
Origin Point Setting
Align real-world coordinates and CG space with ease, using the Origin point chart. With this precise camera tracking data, CG and AR objects can be placed accurately. Fine adjustments to the origin position can then be made via the web menu, even without the Origin point chart.
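Conceptually, origin alignment amounts to re-expressing tracked camera poses relative to a chosen origin pose so they line up with CG-space coordinates. The NumPy sketch below illustrates that math with placeholder values; it is not the system’s internal implementation.

```python
# Illustrative math only: express a tracked camera pose relative to an origin pose.
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# Pose of the origin chart as seen by the tracker (placeholder values).
origin_in_tracker = make_pose(np.eye(3), np.array([1.0, 0.0, 2.0]))

# Pose of the camera as reported by the tracker (placeholder values).
camera_in_tracker = make_pose(np.eye(3), np.array([1.5, 0.2, 3.0]))

# Re-express the camera pose relative to the origin, i.e. in CG space.
camera_in_cg = np.linalg.inv(origin_in_tracker) @ camera_in_tracker
print(camera_in_cg[:3, 3])  # [0.5, 0.2, 1.0]
```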
Web Menu
Handle map creation, tracking, and lens calibration through an intuitive web menu. Review feature points and the relative position of the OCELLUS, visualize the areas where learning was completed during map creation, and display status information to check tracking reliability.
Engineered to Survive
All three components of the OCELLUS camera tracking system are designed to be durable and resistant to dust and moisture*. Both the Sensor Unit and Processing Box feature cooling fans to prevent overheating and ensure reliable performance.
The Sensor Unit and Processing Box are connected by a single 2 m USB Type-C® cable with a locking mechanism, and the Processing Box receives 12 V or 24 V** power through a robust Fischer 3-pin input.
* Not guaranteed to be 100% dust and moisture proof
** The input voltage range is 11 V to 32 V
Camera not included – used for illustrative purposes only.