Inside the Tech - Solving for Avatar Facial Expressions

Inside the Tech is a blog series that accompanies our Tech Talks Podcast. In episode 20 of the podcast, Avatars & Self-Expression, Roblox CEO David Baszucki spoke with Senior Director of Engineering Kiran Bhat, Senior Director of Product Mahesh Ramasubramanian, and Principal Product Manager Effie Goenawan about the future of immersive communication through avatars and the technical challenges we’re solving to enable it. In this edition of Inside the Tech, we talked with Engineering Manager Ian Sachs to learn more about one of those technical challenges—enabling facial expressions for our avatars—and how the work of the Avatar Creation team (under the Engine group) is helping users express themselves on Roblox.

What are the biggest technical challenges your team is taking on?

When we think about how an avatar represents someone on Roblox, we typically consider two things: how it behaves and how it looks. So one major focus for my team is enabling avatars to mirror a person’s expressions. For example, when someone smiles, their avatar smiles in sync with them.

One of the hard things about tracking facial expressions is tuning the efficiency of our model so that we can capture these expressions directly on the person’s device in real time. We’re committed to making this feature accessible to as many people on Roblox as possible, and we need to support a huge range of devices. The amount of compute power someone’s device has is a vital factor in that. We want everyone to be able to express themselves, not just people with powerful devices. So we’re deploying one of our first-ever deep learning models to make this possible.

The second key technical challenge we’re tackling is simplifying the process creators use to develop dynamic avatars people can personalize. Creating avatars like that is pretty complicated: you have to model the head, and if you want it to animate, you have to rig the model in very specific ways, such as placing joints and setting weights for linear blend skinning. We want to make this process easier for creators, so we’re developing technology to simplify it. Creators should only have to focus on building the static model. When they do, we can automatically rig and cage it. Then, facial tracking and layered clothing should work right off the bat.
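Conceptually, the creator-facing step shrinks to supplying just the mesh. Here is a minimal Python sketch of that idea, where `auto_rigger` and `auto_cager` are hypothetical stand-ins rather than Roblox’s actual tooling:

```python
# Hypothetical sketch of the simplified creator pipeline described above:
# the creator supplies only a static head mesh, and the system derives
# everything that animation and layered clothing need from it.

def prepare_dynamic_head(static_mesh, auto_rigger, auto_cager):
    """Turn a static head mesh into an animation-ready avatar head."""
    skeleton = auto_rigger.place_joints(static_mesh)        # joints for facial animation
    weights = auto_rigger.compute_lbs_weights(static_mesh, skeleton)  # linear blend skinning
    cage = auto_cager.fit_outer_cage(static_mesh)           # cage mesh for layered clothing
    return {"mesh": static_mesh, "skeleton": skeleton,
            "weights": weights, "cage": cage}
```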

What are some of the innovative approaches and solutions we’re using to tackle these technical challenges?

We’ve done a couple of important things to ensure we get the right information for facial expressions. That starts with using the industry-standard FACS (Facial Action Coding System). FACS controls are the key to everything because they’re what we use to drive an avatar’s facial expressions—how wide open the mouth is, how open each eye is, and so on. We can use around 50 different FACS controls to describe a desired facial expression.
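To make that concrete, here is a minimal sketch of an expression described as FACS control weights. The control names and values below are illustrative assumptions, not necessarily Roblox’s exact control set:

```python
# A facial expression as FACS control weights in [0, 1].
# Names are illustrative; the real control set has around 50 entries.
smile_with_raised_brows = {
    "LipCornerPullerLeft": 0.8,   # left side of a smile
    "LipCornerPullerRight": 0.8,  # right side of a smile
    "EyesClosedLeft": 0.0,        # left eye fully open
    "EyesClosedRight": 0.0,       # right eye fully open
    "BrowRaiserLeft": 0.4,        # slight brow raise
    "BrowRaiserRight": 0.4,
    # ...remaining controls default to 0.0 (neutral)
}
```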

When you’re building a machine learning algorithm to estimate facial expressions from images or video, you train a model by showing it example images with known ground-truth expressions (described with FACS weights). From many different images with different expressions, the model learns to estimate the facial expressions of previously unseen faces.

Normally, when you’re working on facial tracking, these expressions are labeled by humans, and the easiest method is using landmarks—for example, placing dots on an image to mark the pixel locations of facial features like the corners of the eyes.

But FACS weights are different because you can’t look at a picture and say, “The mouth is open 0.9 versus 0.5.” To solve this, we generate synthetic training data consisting of 3D head models rendered with known FACS poses from different angles and under different lighting conditions, so the ground-truth FACS weights come directly from how each image was produced.
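As a rough illustration of why this works, the label for a synthetic image is known before the image is even rendered. This Python sketch uses hypothetical `renderer` and `rig` objects, not Roblox’s actual data pipeline:

```python
import random

def generate_synthetic_sample(renderer, rig, num_controls=50):
    """Render one training image whose FACS label is known by construction."""
    # Sample a random expression: one weight in [0, 1] per FACS control.
    facs_weights = [random.random() for _ in range(num_controls)]

    # Randomize nuisance factors so the model can't rely on them.
    camera_yaw_deg = random.uniform(-45.0, 45.0)
    lighting = random.choice(["soft", "harsh", "backlit"])

    # Pose the 3D head rig and render it from the sampled viewpoint.
    image = renderer.render(rig.pose(facs_weights), camera_yaw_deg, lighting)

    # Unlike human labeling, the ground truth is exact and free.
    return image, facs_weights
```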

Unfortunately, because the model needs to generalize to real faces, we can’t train solely on synthetic data. So we pre-train the model on a landmark prediction task using a combination of real and synthetic data, which then lets it learn the FACS prediction task from purely synthetic data.
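A minimal sketch of that two-stage recipe, written as PyTorch-style Python (the backbone, heads, constants, and data loaders are placeholder assumptions, not Roblox’s actual architecture):

```python
import torch
import torch.nn as nn

NUM_LANDMARKS = 68       # assumed landmark count (a common convention)
NUM_FACS_CONTROLS = 50   # "around 50" controls, per the interview

def train_two_stage(backbone, real_plus_synth_loader, synth_only_loader):
    # Stage 1: pre-train on landmark prediction using real + synthetic
    # images, so the backbone learns features that hold up on real faces.
    landmark_head = nn.Linear(backbone.out_dim, 2 * NUM_LANDMARKS)
    opt = torch.optim.Adam(
        list(backbone.parameters()) + list(landmark_head.parameters()))
    for images, landmarks in real_plus_synth_loader:
        loss = nn.functional.mse_loss(landmark_head(backbone(images)), landmarks)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Stage 2: train a FACS-weight head on purely synthetic data; the
    # pre-trained backbone is what lets this transfer to real faces.
    facs_head = nn.Linear(backbone.out_dim, NUM_FACS_CONTROLS)
    opt = torch.optim.Adam(
        list(backbone.parameters()) + list(facs_head.parameters()))
    for images, facs_weights in synth_only_loader:
        loss = nn.functional.mse_loss(facs_head(backbone(images)), facs_weights)
        opt.zero_grad()
        loss.backward()
        opt.step()

    return backbone, facs_head
```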

We want face tracking to work for everyone, but some devices are more powerful than others. This means we needed to build a system capable of dynamically adapting itself to the processing power of any device. We accomplished this by splitting our model into a fast approximate FACS prediction phase called BaseNet and a more accurate FACS refinement phase called HiFiNet. During runtime, the system measures its performance, and under optimal conditions, we run both model phases. But if a slowdown is detected (for example, because of a lower-end device), the system runs only the first phase.
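In spirit, the runtime fallback looks something like the following sketch; the frame budget, the timing window, and the callable model objects are simplifying assumptions:

```python
import time

FRAME_BUDGET_S = 1.0 / 30.0  # assumed real-time budget of 30 FPS

def estimate_facs(frame, base_net, hifi_net, recent_times):
    """Always run the fast phase; add refinement only when there is headroom."""
    start = time.perf_counter()

    # Phase 1: BaseNet, the fast approximate FACS prediction (always runs).
    facs = base_net(frame)

    # Phase 2: HiFiNet refinement, only if recent frames met the budget.
    avg_time = sum(recent_times) / max(len(recent_times), 1)
    if avg_time < FRAME_BUDGET_S:
        facs = hifi_net(frame, facs)

    # Track a sliding window of frame times for the next decision.
    recent_times.append(time.perf_counter() - start)
    if len(recent_times) > 30:
        recent_times.pop(0)
    return facs
```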

What are some of the key things that you’ve learned from doing this technical work?

One is that getting a feature to work is such a small part of what it actually takes to release something successfully. A ton of the work is in the engineering and unit testing process. We need good ways of determining whether our data pipeline is solid, and we need to ask ourselves, “Hey, is this new model actually better than the old one?”

All the pipelines we put in place before we even start the core engineering (for tracking experiments, ensuring our dataset represents the diversity of our users, evaluating results, and deploying and gathering feedback on new results) go into making the model good enough. But that’s a part of the process that doesn’t get talked about as much, even though it’s so critical.

Which Roblox value does your team most align with?

Understanding the phase of a project is key. During innovation, taking the long view matters a lot, especially in research, when you’re trying to solve important problems. But respecting the community is also crucial when you’re identifying which problems are worth innovating on, because we want to work on the problems with the most value to our broader community. For example, we specifically chose to work on “face tracking for all” rather than just “face tracking.” And as you reach the 90 percent mark of building something, transitioning a prototype into a functional feature hinges on execution and adapting to the project’s stage.

What excites you the most about where Roblox and your team are headed?

I’ve always gravitated toward working on tools that help people be creative. Creating something is special because you end up with something that’s uniquely yours. I’ve worked in visual effects and on various photo editing tools, using math, science, research, and engineering insights to empower people to do really interesting things. Now, at Roblox, I get to take that to a whole new level. Roblox is a creativity platform, not just a tool. And the scale at which we get to build tools that enable creativity is much bigger than anything I’ve worked on before, which is incredibly exciting.


FAQs

How do you create facial expressions in Roblox?

You can use the Face Animation Editor to access facial sliders that let you quickly create and save unique expressions on the timeline. You can also use the Animation Capture for Faces tool to automatically track and record facial movements with a webcam, converting your movements into keyframes on the timeline.

What is the oldest avatar in Roblox?

In its earliest stages of development (2004-2005), Roblox did not have directly controllable avatars. The earliest known model to represent a human-like character was the "Child on Skateboard" model, created by David Baszucki in November 2004.

How do you use your camera to animate your avatar with your movement?

To enable avatar animation with your device camera, toggle the “Use camera to animate your avatar with your movement” option in your settings. The toggle will turn from gray to green, indicating that avatar animation has been enabled for your account.

Can you get Roblox voice chat under 13?

Voice chat is available only to users 13 and older. All 13+ users who are phone-number verified can opt in by visiting their Settings page; they can then use chat with voice in any Roblox experience that supports it.

Was Roblox really made in 1988?

No. Roblox Corporation was founded in 2004 by David Baszucki and Erik Cassel in Menlo Park, California, and the Roblox platform was released in 2006. As of December 31, 2023, the company employed over 2,400 people.

What is a Y2K Roblox avatar?

“Y2K” refers to a style of fashion that was popular around the year 2000.

What was the first Roblox logo?

By mid-to-late 2004, Roblox had changed its wordmark to text with striking red outlines and an accent over the ‘o’, indicating a long vowel sound. This distinctly early-2000s look was the first official logo used on the website.

How old do you have to be for face tracking on Roblox?

Roblox is a platform where users can create and publish games, as well as play one another’s games using an avatar. Roblox rolled out a feature that allows some players thirteen and older to use face tracking, which animates their avatar based on a live video feed of the player.

How is CGI used in Avatar?

CGI was used to create the film’s digital characters and environments, while practical effects were used for the physical sets and props. The end result is a film that features some of the most realistic and expressive digital characters ever created.

Can you animate a real picture?

You can animate photos from your mobile device using an iOS or Android app. Upload your image in the editor, then choose Animate from the bottom navigation. You can take advantage of the same animation styles and photo-editing tools, and quickly produce high-quality animated pictures on your mobile device just as you would on a desktop.

How do you get good facial expressions?

To get good facial expressions, don’t be shy; just let your face loosen up a bit. Check a mirror or your phone camera to make sure you’re expressing the right emotion. Remember that someone who is in touch with their emotions is very appealing.

How can you emote in Roblox?

To see the default emotes, hit the “/” key to chat and try typing in the following commands:
  1. /e wave
  2. /e point
  3. /e cheer
  4. /e laugh
  5. /e dance
  6. /e dance2
  7. /e dance3

How do you change expressions on Roblox?

You can animate any supported head through the Animation Editor by manually setting poses on the animation track, using the Face Animation Editor to quickly create complex expressions, or using the Animation Capture tool to automatically track facial movements from your webcam.
