Digital character interaction is hard to fake, whether it’s between two characters, between users and characters, or between a character and its environment. Nevertheless, interaction is central to building immersive XR experiences, robotic simulation, and user-driven entertainment. Kevin He will discuss using physical simulation and deep learning to create interactive character technology.
The talk will cover “Interactive Motion” and the physical underpinnings needed to build reactive characters, starting with a physics engine optimized for joint articulation that can dynamically solve for multiple connected rigid bodies. We’ll cover how we used this engine to build a standard ragdoll, give that ragdoll musculature and posture, and eventually solve for basic locomotion by controlling the simulated joints and muscles.
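To make the joint-control idea concrete, here is a minimal sketch of one simulated hinge joint driven toward a target angle by a PD "muscle" torque, integrated with explicit Euler. The gains, inertia, and single-joint setup are illustrative assumptions for this sketch, not DeepMotion's actual engine:

```python
def pd_torque(q, q_dot, q_target, kp=50.0, kd=5.0):
    """PD 'muscle': torque pulling a joint toward a target angle,
    damped by the joint's angular velocity."""
    return kp * (q_target - q) - kd * q_dot

def simulate_joint(q0, q_target, inertia=1.0, dt=0.01, steps=500):
    """Integrate one articulated joint under PD control (explicit Euler)."""
    q, q_dot = q0, 0.0
    for _ in range(steps):
        tau = pd_torque(q, q_dot, q_target)
        q_dot += (tau / inertia) * dt  # angular acceleration = torque / inertia
        q += q_dot * dt
    return q
```

A full articulated solver couples many such joints through constraint forces; this sketch only shows the per-joint actuation that a ragdoll's "muscles" provide.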
The talk will also cover applications of this real-time physics simulation for characters in XR: from dynamic full-body VR avatars, to interactive NPCs and pets, to fully interactive AR avatars and emojis.
We’ll then explain how we applied an AI model to our physically simulated character to generate what we call interactive “Motion Intelligence”. We give our character controllers sample motion data, and they learn the underlying motor skills by reverse-engineering the motion in terms of the simulated musculature. We also optimize for additional reward functions to ensure natural, flexible movement. The character can then reproduce the behavior in real-time physical simulation, transforming static animation data into a flexible, interactive skill. This is the first step in building a digital cerebellum.
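One common formulation of this kind of motion-imitation objective scores the controller with an exponentiated pose-tracking error, blended with auxiliary reward terms. The weights and the flat pose representation below are illustrative assumptions; the source does not specify DeepMotion's exact reward:

```python
import math

def imitation_reward(sim_pose, ref_pose, w_pose=2.0):
    """Exponentiated tracking reward: 1.0 when the simulated pose
    matches the reference motion exactly, decaying with squared error."""
    err = sum((s - r) ** 2 for s, r in zip(sim_pose, ref_pose))
    return math.exp(-w_pose * err)

def total_reward(sim_pose, ref_pose, task_reward, w_imitate=0.7):
    """Blend motion imitation with an additional task objective
    (e.g., staying balanced or reaching a goal)."""
    return w_imitate * imitation_reward(sim_pose, ref_pose) \
        + (1.0 - w_imitate) * task_reward
```

The exponential keeps the reward bounded in (0, 1], so imitation and task terms can be mixed with simple weights during training.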
We’ll then cover creating a “Motion Brain”: a second level of training that stitches multiple behaviors into a control graph, allowing seamless blending between learned motor skills in a variety of sequences.
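A control graph of this kind can be sketched as an adjacency map over learned skills, where edges mark the transitions the blending network was trained on. The skill names and transitions here are hypothetical examples, not DeepMotion's actual graph:

```python
# Control graph: each learned motor skill is a node; edges list
# the skills it can blend into directly.
SKILL_GRAPH = {
    "idle": ["walk"],
    "walk": ["idle", "run"],
    "run":  ["walk", "jump"],
    "jump": ["run"],
}

def next_skill(current, requested):
    """Transition directly if the graph allows it, else keep the
    current skill (a real controller would search for an
    intermediate path, e.g. run -> walk -> idle)."""
    return requested if requested in SKILL_GRAPH[current] else current
```

Restricting transitions to trained edges is what makes the blending seamless: the character never jumps between two skills it has not learned to interpolate.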
Finally, we’ll look at advanced applications of Motion Intelligence and the product we built to put this applied AI tooling in developers’ hands. We’ll cover building a basketball-playing, physically simulated character (as published in a paper for SIGGRAPH 2018) using deep reinforcement learning; these characters learn advanced ball-handling skills to control a ball in 3D space.
Kevin He has spent his career pushing the boundaries of gaming and engineering. Prior to founding DeepMotion, Kevin was CTO of Disney’s mid-core mobile game studio, a technical director at ROBLOX, and a senior engine developer on World of Warcraft at Blizzard. He was also a technical lead at Airespace (now part of Cisco Systems). Kevin has 16 years of engineering and management experience and has shipped multiple AAA titles, including World of Warcraft, StarCraft II, Star Wars Commander, and ROBLOX.
©2019, O'Reilly Media, Inc.