This tutorial covers getting started with Faceware Live Server. You'll learn how to select a video source and configure it, how to calibrate your facial tracking, how to preview the animation, and how to stream data from the application.
(This guide assumes you have already downloaded and installed Faceware Live Server.)
Step 1: Set Up your Video
The first thing to do is choose your Video Source. You do this in the Video Settings window. If a webcam or camera is connected when you open the app, it may be detected immediately and start playing video. Alternatively, you may see a message in the center of the application that reads, "No video source selected." This means no video source is currently selected, and you'll need to choose one before you can begin.
To set up your video:
- Open Video Setup by selecting "Configure Video Setup" from the Video menu.
- Select the video source you'd like to use from the drop-down menu titled "Video Source". A video source with a frame rate of at least 60 frames per second is recommended for real-time facial animation.
- If you're using a head-mounted camera, you can use the built-in tools to align your image for optimal tracking. If necessary, rotate the image using the "Rotate Image" drop-down menu. You can also mirror the image horizontally or vertically.
If you're using a webcam, simply make sure that your entire head is in the frame and that the lighting is sufficient for your face to be clearly visible.
Step 2: Calibrate your Facial Tracking
Calibration in Faceware Live Server is a one-click process that sets a baseline for the software to track from. It's quick and easy, but it's an important part of the process.
To calibrate your facial tracking:
- Open the Calibration Settings by selecting "Calibration Settings" from the Calibration menu.
- Select the correct Tracking Model for your video (found under "Tracking Options"). You'll see two Tracking Models: "StaticCam" and "HeadCam". A static camera is one that doesn't move, such as a webcam or a camera on a tripod. A head-mounted camera is attached to your head and moves with you.
*Note - If you are unsure which one to use, or are having issues with the HeadCam model, select StaticCam.
- Calibrate your face by looking straight at the camera while making a neutral face (no expression, completely relaxed, no movement or emotion), then click "Calibrate Face". The tracking landmarks on the face will pause for a moment; if calibration succeeds, the landmarks will then follow your facial features. You can verify that calibration was successful by checking the status bar, which should read "Calibrated".
You should now see landmark points tracking your face. If you don't see them, verify you're using the correct Tracking Model and that the status bar says, "Calibrated".
Step 3: Preview the Animation
Now that Live Server is tracking your face you can preview the animation it's creating.
To preview the animation:
- Select "Animation Preview" in the View menu.
This will bring up a movable, resizable window where you can view your animation on the included character. It's useful for verifying that your calibration was successful, as well as for testing how the tracker performs with different people, cameras, and lighting conditions.
*Note - You can change the background color of the Animation Preview in the Preferences window. You can also make the Animation Preview full-screen by clicking the 'maximize' icon.
Step 4: Stream the Animation Data
Lastly, you'll want to stream the animation data from Live Server so that Live Clients (such as MotionBuilder or Unity) can receive the data and apply the facial animation to your characters.
To begin streaming the animation:
- Select a Port Number in the toolbar.
- Click the 'Streaming' icon to the right of the Port Number. Clicking the same icon again toggles streaming ON or OFF.
When data is streaming from Live Server, you will see "Streaming Data" in the status bar.
You can now connect to Live Server via an official Live Client, or by writing your own plugin that connects to the TCP stream.
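If you do want to write your own client, the sketch below shows the bare minimum: opening a TCP connection to the port you selected in the toolbar and reading raw bytes from it. This is a minimal, hypothetical example — the host address, port value, and chunk size are assumptions for illustration, and parsing the payload is left out because the wire format depends on the Live Server protocol and version.

```python
import socket

def read_animation_data(host, port, chunk_size=4096):
    """Connect to a Live Server TCP stream and yield raw data chunks.

    This is only a connection sketch: it does not interpret the bytes,
    since the animation-data format is defined by Live Server itself.
    """
    with socket.create_connection((host, port), timeout=5.0) as sock:
        while True:
            chunk = sock.recv(chunk_size)
            if not chunk:  # empty read means the server closed the stream
                break
            yield chunk

# Example usage (replace the port with the Port Number you chose
# in the Live Server toolbar):
#
#   for data in read_animation_data("127.0.0.1", port=YOUR_PORT):
#       print(len(data), "bytes received")
```

From here, a real client would decode each chunk according to the stream's format and map the resulting facial-animation values onto a character rig, which is what the official Live Clients do for you.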