Live Link Face


Live Link Face: Professional Facial Animation for Unreal Engine

Transform your iPhone into a high-fidelity performance capture rig. Stream real-time expressions to MetaHumans or record cinematic data for your production workflow.

Publisher

Epic Games International, S.a.r.l.

Category

Graphics & Design

Downloads

926K+

User Rating

3.6/5

Total Ratings

500

Locales

4

Studio Power in Your Pocket

Discover the interface used by 926K+ users.

Close up of a performer smiling for MetaHuman facial motion capture in Unreal Engine

A high-fidelity digital character created using Unreal Engine MetaHuman technology

Live Link Face app selection screen showing MetaHuman Animator and ARKit capture mode options

A woman performing facial expressions in front of an iPhone mounted on a professional motion capture rig

Real-time facial motion capture interface with mesh overlay in Live Link Face

Take Browser in Live Link Face showing a list of recorded facial capture performances

Video playback of a captured facial performance in the Live Link Face app interface

Studio-Grade Facial Capture

The tools that make this app stand out, trusted by 926K+ users.

👤

MetaHuman Integration

Capture high-fidelity depth data and raw video to power MetaHuman Animator, delivering AAA-quality facial performances in just a few clicks.

⚡

Real-Time Streaming

Stream ARKit animation data over your network to visualize expressions on your 3D mesh instantly with live rendering in Unreal Engine.

🎬

Professional Sync & Control

Maintain frame-accurate synchronization with Tentacle Sync timecode support and manage remote recordings via OSC for complex studio workflows.

About the app

Everything you need to know about Live Link Face.

Description

Live Link Face brings effortless facial animation to Unreal Engine: capture performances on your iPhone or iPad for use in your Unreal Engine projects.

Capture facial performances for MetaHuman Animator:
- Live Link Face supports both real-time and processed animation.
- MetaHuman Animator uses Live Link Face to capture performances, then applies its own processing to create high-fidelity facial animation for MetaHumans.
- The app captures raw video and depth data, which can be transmitted directly from your device into Unreal Engine for use with the MetaHuman plugin.
- Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
- This workflow requires an iPhone 12 or later and a desktop PC running Windows 10/11.

Real-time animation for MetaHumans:
- The app generates animation to drive a MetaHuman character in real time.
- Animation data is streamed to Unreal Engine over a network using the MetaHuman Live Link Plugin.
- This workflow requires an iPhone 12 or later and a desktop PC running Windows 10/11.

Real-time animation for non-MetaHuman characters:
- Stream ARKit animation data live to an Unreal Engine instance via Live Link over a network.
- Visualize facial expressions in real time with live rendering in Unreal Engine.
- Drive a 3D preview mesh, optionally overlaid on the video reference on the phone.
- Record the raw ARKit animation data and front-facing video reference footage.
- Tune the capture data to the individual performer and improve facial animation quality with rest-pose calibration.

Timecode support for multi-device synchronization:
- Select the system clock, an NTP server, or a Tentacle Sync device to connect with a master clock on stage.
- Video reference is frame accurate, with embedded timecode for editorial.

Control Live Link Face remotely with OSC or via the MetaHuman Plugin for Unreal Engine:
- Trigger recording remotely so actors can focus on their performances.
- Capture slate names and take numbers consistently.
- Extract data for processing and storage.

Browse and manage the library of captured takes:
- Delete takes within Live Link Face, or share them via AirDrop.
- Transfer takes directly over the network when using MetaHuman Animator.
- Review captured footage on-device.
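The rest-pose calibration mentioned in the description can be pictured as a simple rescaling of ARKit blendshape weights so that any residual activation in the performer's neutral face maps to zero. Epic does not publish the exact formula it uses, so the sketch below is purely illustrative; the blendshape names follow Apple's ARKit naming.

```python
def calibrate(raw: dict[str, float], rest: dict[str, float]) -> dict[str, float]:
    """Rescale raw ARKit blendshape weights (0..1) so that residual
    activation present in the performer's rest pose maps to 0.
    Hypothetical formula; Epic's actual calibration is not published."""
    out = {}
    for name, value in raw.items():
        r = rest.get(name, 0.0)
        if r >= 1.0:
            # Degenerate rest value: the shape is fully on at rest.
            out[name] = 0.0
        else:
            out[name] = max(0.0, (value - r) / (1.0 - r))
    return out

# A performer whose brow sits slightly raised at rest:
weights = calibrate({"browInnerUp": 0.6, "jawOpen": 0.3},
                    {"browInnerUp": 0.2, "jawOpen": 0.0})
# browInnerUp rescales to (0.6 - 0.2) / 0.8 = 0.5; jawOpen is unchanged at 0.3
```

The rescaling keeps the full 0..1 output range available above the rest pose, rather than merely subtracting the offset.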

Latest Version

1.6.0

Size

116.6 MB

First Released

Jul 7, 2020

Turn Your iPhone into a Professional Mocap Rig

Join the thousands of creators bringing AAA-quality facial animation to Unreal Engine. Download Live Link Face and start capturing high-fidelity performances today.

FAQ

Frequently Asked Questions

Everything you need to know about Live Link Face

What is Live Link Face used for?

Live Link Face captures facial performances from an iPhone or iPad for Unreal Engine. It enables both real-time and processed facial animation for MetaHuman and other characters, integrating directly with Unreal Engine workflows.

Which devices are compatible with Live Link Face?

Live Link Face requires an iPhone 12 or later for facial performance capture. A desktop PC running Windows 10 or 11 is also needed to use the captured data within Unreal Engine.

Does Live Link Face support MetaHuman Animator?

Yes, Live Link Face supports MetaHuman Animator. It captures raw video and depth data, which MetaHuman Animator processes to create high-fidelity facial animation for MetaHuman characters in Unreal Engine.

Can Live Link Face animate non-MetaHuman characters in real-time?

Yes, Live Link Face streams ARKit animation data live to an Unreal Engine instance. This enables real-time visualization of facial expressions on non-MetaHuman characters and supports custom 3D preview meshes.

How does Live Link Face manage captured facial performance takes?

Live Link Face manages captured takes through a take browser. Users can delete takes, share via AirDrop, transfer over a network for MetaHuman Animator, and review footage directly on the device.

Does Live Link Face support timecode synchronization?

Yes, Live Link Face offers timecode support for multi-device synchronization. Options include the system clock, an NTP server, or a Tentacle Sync device to connect with a master clock on stage.
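Whichever clock source is chosen, frame-accurate synchronization comes down to stamping each frame with an HH:MM:SS:FF timecode derived from a shared clock. The sketch below shows the generic non-drop-frame conversion for integer frame rates; it is not Tentacle Sync's or Epic's implementation, and fractional rates such as 29.97 drop-frame are deliberately out of scope.

```python
def to_timecode(seconds_since_midnight: float, fps: int = 60) -> str:
    """Convert a clock reading (system clock, NTP, or a master clock on
    stage) to a non-drop-frame HH:MM:SS:FF timecode string."""
    total_frames = int(seconds_since_midnight * fps)
    frames = total_frames % fps
    s = total_frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{frames:02d}"

print(to_timecode(3723.5, fps=60))  # 01:02:03:30
```

Two devices reading the same master clock will emit identical timecode for the same instant, which is what lets editorial line up takes from multiple cameras and phones.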

Can I remotely control Live Link Face recordings?

Yes, users can remotely control Live Link Face via OSC or the MetaHuman Plugin for Unreal Engine. This allows for triggering recordings remotely and consistently capturing slate names and take numbers.
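OSC messages are small binary packets sent over UDP: an address pattern, a type-tag string, and arguments, each padded to a 4-byte boundary per the OSC 1.0 spec. As a hedged illustration, the sketch below encodes such a packet using only the standard library. The /RecordStart address and its slate/take argument layout are taken on assumption from Epic's OSC documentation, and the phone IP and port are placeholders; verify both against the current Live Link Face docs before relying on them.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message supporting string and int32 arguments only."""
    tags = "," + "".join("s" if isinstance(a, str) else "i" for a in args)
    msg = osc_pad(address.encode()) + osc_pad(tags.encode())
    for a in args:
        msg += osc_pad(a.encode()) if isinstance(a, str) else struct.pack(">i", a)
    return msg

# Trigger a recording on the phone. Address and argument layout assumed
# from Epic's "/RecordStart <slate> <take>" command; 192.168.1.50 is a
# placeholder for the phone's IP, and the listen port is set in-app.
packet = osc_message("/RecordStart", "SceneA_Take", 3)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("192.168.1.50", 8000))
```

In practice a dedicated OSC library (e.g. python-osc) handles the encoding, but the raw format is simple enough that a control surface or shell script can speak it directly.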

At what frame rate does Live Link Face capture facial performance data?

Live Link Face captures facial performance data at 60 frames per second. This frame rate applies to both the real-time streaming feed and recorded takes, supporting high-fidelity animation.

Recommendations

More Like This

Apps with similar features and user experience

App information, icons, screenshots, and descriptions displayed on this page are sourced from the Apple App Store and are the property of their respective developers. Download estimates and rankings are based on MWM's proprietary models and may not reflect actual figures. This page is provided for informational and analytical purposes only.

Believe this page infringes your intellectual property? File a dispute