
@memmam
Last active May 2, 2022 01:14
Mocap helmet instructions

Mocap Helmet Build Guide

Parts list

Needed software

  • VSeeFace - Free, this guide is written assuming you are using VSeeFace as your primary tracking software

You will also need one of the following facial tracking programs:

  • VTube Studio (iPhone) / (Android) - Needed functionality is free; works with any iPhone with FaceID or the 2nd/3rd-gen iPhone SE, and with Android devices (although Android tracking isn't QUITE as good as iPhone)
  • iFacialMocap - $7.99, ONLY works with iPhones with FaceID
  • FaceMotion3D - Free to test, $15.99 to own; works with any iPhone with FaceID OR the 2nd/3rd-gen iPhone SE. Unlike the other two options, it allows for fine-tuning blendshapes. Buy the 'Other' tracking option, NOT the 'Unity' tracking option!

You can also use Waidayo if you have an iPhone with FaceID, but keep in mind this will use up one of your VMC receivers in VSeeFace, whereas VSeeFace has dedicated inputs for VTube Studio, iFacialMocap, or FaceMotion3D. Given this setup requires you to use a separate method like VirtualMotionCapture for head position data, using Waidayo is not advised, as this will use up BOTH of your VMC receivers, preventing you from using things like the Twitch Integrated Throwing System.

Alternatively, you can use EpocCam (on iPhone) or DroidCam (on Android) to treat the phone as a simple webcam instead of doing facial motion capture. I am personally going this route because my low-poly model doesn't actually benefit from motion capture tracking due to how it works, and VSeeFace's expression detection only works with webcam tracking.

Instructions

We will roughly be following the helmet build seen here and here, with some alterations. Credit to filmmaker Jae Solina, who made the above videos and original helmet design.

First, some notes about part alternatives. The SUREWO and HSU aluminum GoPro mount/extension arm were chosen for stability. You can use the Walway and OctinPris hardware instead, but since it's plastic, it's far less stable and prone to swaying when you move, which makes it unsuitable for e.g. exercising or dancing.

The submersible LED tea lights can likely be replaced with a cheaper alternative; I bought the ones that Jae Solina used so that I'd have a spare remote and a bunch of spare lights in case something happens, and batteries are included for all of them. The lights can be omitted entirely if you don't need lighting.

First, get the skateboard helmet. Any will do as long as it's dome-shaped and has two holes in the front and two in the back. I personally used a Triple 8 skateboard helmet instead of the OutdoorMaster helmet; it's what I had on hand, and the two seem identical for the needs of this project. Mount the GoPro motorcycle helmet mount on the forehead area of the helmet, as low as it'll go, using the included VHB tape, with the teeth of the GoPro mount vertical, NOT horizontal.

Get your medium-length GoPro extension arm. Using the included tool and one of the included screws, attach the arm to the motorcycle mount. Get the GoPro tripod mount, and do the same, attaching it to the other end of the extension arm.

Attach the phone mount to the tripod mount and insert the phone. You want the phone to be in portrait orientation, not landscape, hanging down from the GoPro arm.

At this point, if you need positional tracking (e.g. if you are following my mocap suit guide), use a tracker strap threaded through the holes on the front of the helmet or some of your left-over VHB tape to attach a VR tracker just above where the motorcycle mount is on the helmet.

If you bought the submersible LED lights, cut some VHB tape into half-circles to fit inside the gaps on the bottom of one of the lights, and affix it to the underside of the GoPro extension arm, taking care that it is not too close to you or to the phone: too close to you and it might blind you; too close to the phone and it'll blow out the image.

Using 3M Dual-Lock, affix the battery to the back of the helmet, and then secure it with zipties using the two holes on the back of the helmet. Next, attach the switching plate to the helmet over top of the battery. Here is an example photo. I elected to use the two included screws just for extra weight / so I don't lose track of them; they aren't doing anything important and can be omitted.

Using yet more zipties to secure it, run the USB-C to Lightning cable over the top of the helmet, so it can plug into the phone and into the battery without hanging down.

For the sake of documentation, I should mention that I initially attempted to use a photography light that attached to the phone using a cold shoe adapter built into a less expensive phone holder; not only did the phone holder need to be replaced because it was far too flimsy, but the light added too much weight for the helmet to be usable. I HEAVILY recommend the submersible LED tea lights linked above instead.

Here are some photos of my finished product, for reference: [1], [2], [3], [4]

From here, set up iPhone face tracking as normal; the procedure for this will differ depending on what software you're using. If you are using VTube Studio, iFacialMocap, or FaceMotion3D, enable 'iPhone/ARKit tracking receiver' in VSeeFace General Settings, select the tracking app you're using, and make sure 'Receive head movement' is unchecked. If you are using Waidayo, use one of the OSC/VMC protocol receivers instead. If you are using Waidayo with the primary receiver, make sure you are not applying facial features from VSeeFace; if you are using Waidayo with the secondary receiver, make sure 'Apply blendshapes', 'Apply eye bones', and 'Apply jaw bone' are checked.

With this setup you will not be receiving head/neck position data from the iPhone, as it is affixed to your head. If you followed the steps correctly, VSeeFace should NOT be attempting to apply iPhone-tracked head position data; you must get this from some other VMC-based tracking solution, which is why my helmet has a VR tracker on it.
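For context, the VMC protocol that carries that head position data is just OSC messages over UDP, so any tracker bridge can feed it to VSeeFace's OSC/VMC receiver. Below is a minimal, stdlib-only sketch that hand-encodes a `/VMC/Ext/Root/Pos` message (name string, position x/y/z, rotation quaternion) and sends it; the port 39539 and the hardcoded pose values are assumptions for illustration, not something this guide's setup requires you to run.

```python
import socket
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a simple OSC message; supports str and float arguments."""
    def pad(b: bytes) -> bytes:
        b += b"\x00"                         # OSC strings are NUL-terminated...
        return b + b"\x00" * (-len(b) % 4)   # ...and padded to a 4-byte boundary
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, str):
            tags += "s"
            payload += pad(a.encode())
        else:
            tags += "f"
            payload += struct.pack(">f", float(a))  # OSC floats are big-endian
    return pad(address.encode()) + pad(tags.encode()) + payload

# Hypothetical pose: head 1.6 m up, identity rotation.
packet = osc_message("/VMC/Ext/Root/Pos", "root",
                     0.0, 1.6, 0.0,        # position x, y, z
                     0.0, 0.0, 0.0, 1.0)   # rotation quaternion x, y, z, w

# Send to a VMC receiver on this machine (port is an assumed default).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 39539))
```

In practice VirtualMotionCapture (or whatever VMC-based solution you pair with the VR tracker) emits a stream of these messages many times per second; the sketch just shows what one of them looks like on the wire.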

You should be done!
