Index:

  1. 1.0: General
  2. 2.0: The OSC data structure
  3. 3.0: OSC Receiver Unity example
  4. 4.0: External links


 

1.1 What is live mode:

Live mode transmits captured animation data over a local Wi-Fi network using the OSC protocol.

The idea is to provide users with an 'open' way to integrate facial motion capture data into their projects.

Because project needs vary widely, it is up to users to develop a receiver that suits their needs. However, as Unity is a very common platform, an example project is provided. You will also find external links to projects that enable Face Cap data for other software packages/platforms.


 

1.2 What is OSC:

Open Sound Control (OSC) is a protocol for communication among computers, sound synthesizers, and other multimedia devices that is optimized for modern networking technology.

OSC is easy to implement, as libraries are widely available in many different programming languages and environments: for example Unity, VVVV, Processing, openFrameworks, and Arduino.

OSC allows the data Face Cap sends to be used to drive almost anything: use it for eye tracking, to drive a synthesizer with your face, and so on.


 

2.1 OSC Addresses:

  1. Address: /HT + 3 Floats (x,y,z) = Head position.
  2. Address: /HR + 3 Floats (x,y,z) = Head rotation in degrees.
  3. Address: /HRQ + 4 Floats (x,y,z,w) = Head rotation as quaternion.
  4. Address: /ELR + 2 Floats (x,y) = Eye left rotation.
  5. Address: /ERR + 2 Floats (x,y) = Eye right rotation.
  6. Address: /W + 1 Int + 1 Float (blendshape index, value) = Blendshape parameters.
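Each of these messages is a standard OSC 1.0 packet: a NUL-terminated, 4-byte-padded address string, a 4-byte-padded type tag string (e.g. `,if` for /W), followed by big-endian arguments. As an illustration of that layout (a sketch, not code from the Face Cap project), a minimal decoder in Python:

```python
import struct

def _read_padded_string(data: bytes, offset: int):
    """Read a NUL-terminated OSC string, padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # Skip the terminator and round up to the next 4-byte boundary.
    offset = (end + 4) & ~3
    return s, offset

def parse_osc_message(data: bytes):
    """Parse a single OSC message into (address, [arguments])."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":      # 32-bit big-endian float
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":    # 32-bit big-endian int
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        else:
            raise ValueError(f"unhandled OSC type tag: {tag}")
    return address, args

# Example: a /W message setting blendshape index 24 (jawOpen) to 0.75.
msg = b"/W\x00\x00" + b",if\x00" + struct.pack(">if", 24, 0.75)
print(parse_osc_message(msg))  # ('/W', [24, 0.75])
```

Real receivers would normally use an existing OSC library instead, but the decoder above shows exactly what arrives on the wire for each address.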

Additional:
You might have to compensate for your application's coordinate system and units. Unity, for example, uses a left-handed coordinate system (equivalent to a right-handed system with the Z axis flipped). This can result in mirrored translations and rotations if not converted/compensated.
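Which axes need flipping depends on how your rig and application are set up, so treat the following as a sketch of the idea rather than the app's exact convention. Converting right-handed data to a left-handed system by flipping the Z axis negates the Z translation and the rotations about the X and Y axes:

```python
def to_left_handed(position, euler_degrees):
    """Convert a right-handed (x, y, z) position and Euler rotation to a
    left-handed system by flipping the Z axis: negate the Z translation
    and the rotations about the X and Y axes. Which axes to flip depends
    on your application's conventions; this shows one common case."""
    x, y, z = position
    rx, ry, rz = euler_degrees
    return (x, y, -z), (-rx, -ry, rz)

# A head 3 units in front of the camera, looking slightly down-left:
print(to_left_handed((0.0, 1.6, 3.0), (10.0, -20.0, 0.0)))
# ((0.0, 1.6, -3.0), (-10.0, 20.0, 0.0))
```

Without this compensation, a head turn to the left in the app can appear as a turn to the right on the receiving side.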


 

2.2 Blendshape indices:

Naming convention: the suffixes _L and _R indicate symmetrical shapes, while Left and Right indicate a direction in non-symmetrical shapes.

  Blendshape index : Blendshape name
  00 : browInnerUp
  01 : browDown_L
  02 : browDown_R
  03 : browOuterUp_L
  04 : browOuterUp_R
  05 : eyeLookUp_L
  06 : eyeLookUp_R
  07 : eyeLookDown_L
  08 : eyeLookDown_R
  09 : eyeLookIn_L
  10 : eyeLookIn_R
  11 : eyeLookOut_L
  12 : eyeLookOut_R
  13 : eyeBlink_L
  14 : eyeBlink_R
  15 : eyeSquint_L
  16 : eyeSquint_R
  17 : eyeWide_L
  18 : eyeWide_R
  19 : cheekPuff
  20 : cheekSquint_L
  21 : cheekSquint_R
  22 : noseSneer_L
  23 : noseSneer_R
  24 : jawOpen
  25 : jawForward
  26 : jawLeft
  27 : jawRight
  28 : mouthFunnel
  29 : mouthPucker
  30 : mouthLeft
  31 : mouthRight
  32 : mouthRollUpper
  33 : mouthRollLower
  34 : mouthShrugUpper
  35 : mouthShrugLower
  36 : mouthClose
  37 : mouthSmile_L
  38 : mouthSmile_R
  39 : mouthFrown_L
  40 : mouthFrown_R
  41 : mouthDimple_L
  42 : mouthDimple_R
  43 : mouthUpperUp_L
  44 : mouthUpperUp_R
  45 : mouthLowerDown_L
  46 : mouthLowerDown_R
  47 : mouthPress_L
  48 : mouthPress_R
  49 : mouthStretch_L
  50 : mouthStretch_R
  51 : tongueOut
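Since a /W message carries only an index and a weight, a receiver needs this table to recover the blendshape name. The table transcribed as a lookup list:

```python
# Index-ordered blendshape names, transcribed from the table above.
BLENDSHAPE_NAMES = [
    "browInnerUp", "browDown_L", "browDown_R", "browOuterUp_L",
    "browOuterUp_R", "eyeLookUp_L", "eyeLookUp_R", "eyeLookDown_L",
    "eyeLookDown_R", "eyeLookIn_L", "eyeLookIn_R", "eyeLookOut_L",
    "eyeLookOut_R", "eyeBlink_L", "eyeBlink_R", "eyeSquint_L",
    "eyeSquint_R", "eyeWide_L", "eyeWide_R", "cheekPuff",
    "cheekSquint_L", "cheekSquint_R", "noseSneer_L", "noseSneer_R",
    "jawOpen", "jawForward", "jawLeft", "jawRight",
    "mouthFunnel", "mouthPucker", "mouthLeft", "mouthRight",
    "mouthRollUpper", "mouthRollLower", "mouthShrugUpper",
    "mouthShrugLower", "mouthClose", "mouthSmile_L", "mouthSmile_R",
    "mouthFrown_L", "mouthFrown_R", "mouthDimple_L", "mouthDimple_R",
    "mouthUpperUp_L", "mouthUpperUp_R", "mouthLowerDown_L",
    "mouthLowerDown_R", "mouthPress_L", "mouthPress_R",
    "mouthStretch_L", "mouthStretch_R", "tongueOut",
]

def describe_blendshape(index: int, value: float) -> str:
    """Turn a /W (index, value) pair into a readable assignment."""
    return f"{BLENDSHAPE_NAMES[index]} = {value:.2f}"

print(describe_blendshape(24, 0.75))  # jawOpen = 0.75
```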

 

3.1 Live mode OSC Receiver Unity project.

The Unity live mode OSC receiver project is stored and maintained on GitHub: Download it here.


 

3.2 Setup.

  1. Install Unity v2019.3.0f3 or newer.
    • Sign up for a Unity account.
  2. Make sure your PC or Mac is connected to the same Wi-Fi network as your iOS device.
    • Make sure Unity is allowed through the firewall
    • Disable your VPN if you have one.
  3. Start Unity and open the live mode OSC receiver project.
    • From the project window, load the scene: FaceCapOSCReceiverGenericExample.
    • Play the scene and select the scripts node in the hierarchy window.
    • Note the IP address and port, and enter these when connecting live mode in the app.
    • You should see the app and Unity move in sync.
  4. If you've gone through the setup process but Unity does not respond:
    • In 99% of cases Unity is being blocked by the operating system firewall. Dig into the firewall settings and make sure Unity is allowed through.
    • If this does not resolve your issue please use the contact form and we'll debug the network together.
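When debugging, it can help to rule Unity out entirely and check whether any OSC packets reach the machine at all. A minimal sketch using only the Python standard library (the port number below is a placeholder; use whatever port your receiver reports):

```python
import socket

def wait_for_packet(port: int, timeout: float = 30.0):
    """Listen on a UDP port and return the first datagram received,
    or None if nothing arrives before the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.bind(("", port))  # listen on all interfaces
        try:
            data, _sender = sock.recvfrom(4096)
        except socket.timeout:
            return None
    return data

# Usage (9000 is a placeholder port):
#   wait_for_packet(9000)
# If Face Cap is streaming and this still returns None, a firewall or
# network issue is blocking the traffic before it ever reaches Unity.
```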

 

3.3 Custom avatar setup.

The Unity project contains two example scenes: one for avatars that have exactly the same blendshapes and blendshape names as the Face Cap generic avatar, and one for custom avatars that lack some blendshapes or use different blendshape names. Examine the FaceCapOSCReceiverCustomExample scene to see how it is set up.

Steps to configure a custom avatar:
  1. A new FaceCapRemappingObject needs to be created. Right-click in the project window to create one.
  2. Click on the newly created FaceCapRemappingObject to configure it. Assign the custom avatar mesh that contains the blendshapes, and match the custom avatar blendshapes to those in the Face Cap data. Save the project to save your configuration.
  3. Next, replicate the setup in the FaceCapOSCReceiverCustomExample scene. Duplicate the scene in the project window and open it. Remove the example avatar and replace it with your own. In the scripts node, assign the required properties, including the newly created FaceCapRemappingObject.

You should now be all set to drive the custom avatar using Face Cap.
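The remapping object itself is Unity-specific, but the underlying idea is just a translation from Face Cap blendshape indices to your avatar's own blendshape names. A sketch of that concept in Python (all the custom names here are hypothetical, not part of the Unity project):

```python
# Hypothetical mapping from Face Cap blendshape indices to the
# blendshape names of a custom avatar. Indices the avatar does not
# support are simply left out and ignored.
CUSTOM_REMAP = {
    13: "Blink_Left",   # eyeBlink_L on the generic avatar
    14: "Blink_Right",  # eyeBlink_R
    24: "MouthOpen",    # jawOpen
}

def remap(index: int, value: float):
    """Translate a /W (index, value) pair into (custom_name, value),
    or None if the avatar has no matching blendshape."""
    name = CUSTOM_REMAP.get(index)
    return (name, value) if name is not None else None

print(remap(24, 0.5))   # ('MouthOpen', 0.5)
print(remap(51, 1.0))   # None (this avatar has no tongueOut shape)
```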


 

4.1 Sergei Solokhin's plugin for MotionBuilder.

  1. Sergei Solokhin has created a plugin for live streaming to MotionBuilder.
    Get it here: GitHub.

 

4.2 JPfeP's add-on for Blender.

  1. JPfeP has created an add-on for live streaming into Blender.
    Get it here: Add routes.