• Added support for Left/Right Eye Yaw/Pitch/Roll.
• Added optional warning message when recording at less than 60 FPS.
• Improved handling of dropped frames in reference videos.
• Improved timecode reliability with NTP and Tentacle Sync.
Live Link Face for Unreal Engine: virtual production-ready facial animation in real time from your iPhone.
Stream high-quality facial expressions to characters and visualize them with live rendering in Unreal Engine. Record facial tracking data that can be fine-tuned in animation tools to achieve a final performance, then assembled in Unreal Engine’s Sequencer. Shoot professional-grade performance capture with an integrated stage workflow.
Facial animation via front-camera and ARKit:
● Stream out the data live to an Unreal Engine instance via Live Link over a network.
● Drive a 3D preview mesh, optionally overlaid over the video reference on the phone.
● Record the raw facial animation data and front-facing video reference footage.
Timecode support for multi-device synchronization:
● Select from the iPhone system clock, an NTP server, or a Tentacle Sync device connected to a master clock on stage.
● Video reference is frame accurate with embedded timecode for editorial.
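To illustrate how frame-accurate timecode lines up with recorded frames, here is a minimal sketch that converts an absolute frame count into an SMPTE-style HH:MM:SS:FF string at a fixed frame rate (non-drop-frame; the function name and 60 FPS default are illustrative, not part of the app):

```python
def frames_to_timecode(frame_count: int, fps: int = 60) -> str:
    """Convert an absolute frame count to non-drop-frame HH:MM:SS:FF."""
    frames = frame_count % fps            # frame number within the current second
    total_seconds = frame_count // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# Example: frame 61 at 60 FPS is one second and one frame in.
print(frames_to_timecode(61))  # → 00:00:01:01
```

Drop-frame rates such as 29.97 FPS require a correction that skips frame numbers at minute boundaries; the non-drop form above is the simplest case.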
Control Live Link Face remotely with OSC:
● Trigger recording externally so actors can focus on their performances.
● Capture slate names and take numbers consistently.
● Extract data automatically for archival.
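OSC (Open Sound Control) carries these remote commands as small UDP packets. As a rough sketch of what a controller would send, the following encodes an OSC message per the OSC 1.0 format (null-padded address, type-tag string, big-endian arguments). The `/RecordStart` address, slate name, and take number shown are illustrative placeholders, not confirmed Live Link Face endpoints; consult the app's OSC documentation for the actual address space:

```python
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC 1.0 message (int32 and string arguments only)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # big-endian int32
        elif isinstance(a, str):
            tags += "s"
            payload += pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)!r}")
    return pad(address.encode()) + pad(tags.encode()) + payload

# Hypothetical trigger: address and arguments are assumptions for illustration.
msg = osc_message("/RecordStart", "SlateA", 1)
# A controller would send this over UDP, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (phone_ip, osc_port))
```

Because OSC is plain UDP, any stage-control software or hardware panel that can emit these packets can act as the trigger.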
Browse and manage the captured library of takes within Live Link Face:
● Delete takes or share them via AirDrop.
● Play back the reference video on the phone.
- October 23, 2020: Initial release