New third-party apps and creative methods allow you to create something far removed from the world of Animoji & Memoji.

It's not even been a year since the iPhone X debuted with its TrueDepth camera and facial recognition software, and already third-party app makers and creators have started to explore what TrueDepth can do. This will be of interest to any artists wondering whether they can unlock the capabilities of their iPhone X for motion capture work, and produce something more substantive than a nodding chicken head to send to their friends.

Anyone wondering if body capture work is possible with the iPhone X can try a technique being pioneered by Cory Strassburger, co-founder of LA-based cinematic VR studio Kite & Lightning. Using an iPhone X in tandem with Xsens inertial motion capture technology, Cory shows that you can produce simultaneous full-body and facial performance capture, with the final animated character live streamed, transferred and cleaned via IKINEMA LiveAction into Epic Games’ Unreal Engine. It all happens in real time, as the below video demonstrates.

The method relies on a DIY mocap helmet with an iPhone X directed at the user’s face, an Xsens MVN system and IKINEMA LiveAction to stream and retarget the motion to your character of choice in Unreal Engine. With this setup, users can act out a scene wherever they are, as Cory will demonstrate at this year's SIGGRAPH - where Digital Arts will be in attendance, FYI.
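The phone-side half of that rig is essentially ARKit's face tracking, which uses the TrueDepth camera to deliver around 50 named blendshape coefficients plus a head transform every frame. This isn't Kite & Lightning's actual capture code - just a minimal sketch of the data a helmet-mounted iPhone X makes available, assuming a plain ARKit session:

```swift
import ARKit

// Minimal sketch of the phone-side face capture ARKit provides on the iPhone X.
// Not Kite & Lightning's pipeline - just the raw data a rig like theirs taps into.
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking only works on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever the face anchor updates (roughly every camera frame).
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // blendShapes maps named expressions (jawOpen, browInnerUp, ...) to 0...1 weights.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let blinkLeft = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        // A real pipeline would forward these coefficients (plus the head transform) downstream.
        print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft), head: \(face.transform)")
    }
}
```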

Those interested in facial capture work, meanwhile, can have a go with Live Face, an app which was recently featured on our sister site Macworld. Released by Reallusion in June, this free app tracks data points on your face and feeds the tracking data to a Mac or PC in real time, with the iPhone X acting as a hotspot and connecting to your computer over Wi-Fi.

It's worth noting, though, that the end data is received and processed by Reallusion's CrazyTalk Animator 3 suite; the app is currently only compatible with this software, meaning anyone without it will have to shell out £84.99/$89.99 on the App Store if interested.
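Reallusion doesn't document the wire format here, so purely as an illustration of the general idea - per-frame blendshape weights serialised on the phone and pushed to a listening desktop app over the local network - here's a hedged Swift sketch. The address, port and JSON payload are invented for the example and are not Live Face's actual protocol:

```swift
import ARKit
import Network

// Illustrative only: ships each frame's blendshape weights to a desktop listener as JSON.
// The address, port and payload shape are made up; this is not Reallusion's protocol.
final class BlendShapeStreamer {
    private let connection = NWConnection(host: "192.168.1.20", port: 11111, using: .udp)

    func start() {
        connection.start(queue: .global())
    }

    func send(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        // Flatten ARKit's typed keys into plain strings so they survive serialisation.
        let weights = Dictionary(uniqueKeysWithValues: blendShapes.map { ($0.key.rawValue, $0.value.floatValue) })
        guard let payload = try? JSONEncoder().encode(weights) else { return }
        connection.send(content: payload, completion: .contentProcessed({ error in
            if let error = error { print("send failed: \(error)") }
        }))
    }
}
```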

Another option for facial capture is CV-AR, a free app from Maxon also released in June that works with the company's own Cinema 4D. The software captures your facial animation and sends it to C4D, textures and sound included, with the data stored locally on your iPhone inside the app itself.

The app is designed to make the capture and transfer of facial animation as seamless and effortless as possible; transfers are initiated by scanning a QR code, so there are no hotspot or USB options with this one.

A more low-key release is Face Cap from solo developer Niels Jansson. This one is particularly interesting as its output works not only with C4D, but also with LightWave 3D, Autodesk Maya, Houdini and Blender, putting its competitors to shame (insert blush-face emoji).

With Face Cap you can capture 50 different facial expressions and record for up to 10 minutes at a time. It exports a generic mesh, blendshapes and animation as ASCII FBX, and offers a native iOS sharing interface so you can email or Dropbox your recordings.
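Under the hood, a recording like Face Cap's boils down to a timestamped stream of those blendshape weights, which the FBX export then bakes out as keyframes. The structure below is an assumption for illustration rather than Face Cap's actual implementation - just a rough sketch of what such a capture buffer could look like:

```swift
import Foundation
import ARKit

// Hypothetical capture buffer: timestamps each frame's blendshape weights,
// capped at the stated 10-minute recording limit.
struct BlendShapeFrame {
    let time: TimeInterval          // seconds since the recording started
    let weights: [String: Float]    // e.g. "jawOpen": 0.42
}

final class FaceRecording {
    private(set) var frames: [BlendShapeFrame] = []
    private var startTime: TimeInterval?
    private let maxDuration: TimeInterval = 10 * 60

    // Call once per ARKit frame with the anchor's blendShapes and the frame timestamp.
    func append(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber], at timestamp: TimeInterval) {
        let start = startTime ?? timestamp
        startTime = start
        let t = timestamp - start
        guard t <= maxDuration else { return }   // stop adding frames past the cap
        let weights = Dictionary(uniqueKeysWithValues: blendShapes.map { ($0.key.rawValue, $0.value.floatValue) })
        frames.append(BlendShapeFrame(time: t, weights: weights))
    }
}
```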

Lastly, let's have a little look at something different - the Animation for iPhone X service from France's Polywink.

Polywink is a new online platform for 3D facial animation that aims to save studios and 3D professionals time and budget by automatically generating blendshapes and rigs. In other words, it's an outsourcing service, with its iPhone X option allowing you to upload a neutral 3D head model and receive a character head ready to be animated. 

The service automatically generates a set of 51 blendshapes adapted to the specific topology and morphology of your character. These closely follow Apple's ARKit documentation, meaning you can plug your model into the ARKit Unity Plugin and let the iPhone's face tracking do the rest; no additional rigging or modelling is required. The service will set you back $299.00 per head, and promises a 24-hour turnaround.
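Because the delivered shapes are named after ARKit's own blendshape keys, driving the head is largely a matter of copying each frame's coefficients into matching morph-target weights. Polywink's suggested route is the ARKit Unity Plugin; as an alternative illustration, here's a SceneKit-flavoured Swift sketch, assuming the head has been loaded as a node whose morpher targets carry those ARKit names:

```swift
import ARKit
import SceneKit

// Illustrative retargeting: copy ARKit's per-frame blendshape coefficients onto a
// SceneKit head whose morph targets are named after the same ARKit keys
// (e.g. "jawOpen", "browInnerUp"). Assumes the model has been loaded elsewhere.
final class FaceRetargeter {
    let headNode: SCNNode   // the imported character head, with an SCNMorpher attached

    init(headNode: SCNNode) {
        self.headNode = headNode
    }

    func apply(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        guard let morpher = headNode.morpher else { return }
        for (location, weight) in blendShapes {
            // Matching is purely by name, which is why the rig must follow ARKit's naming.
            morpher.setWeight(CGFloat(weight.floatValue), forTargetNamed: location.rawValue)
        }
    }
}
```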
