Mar 15–18, 2020

Deepfakes 2.0: How neural networks are changing our world

11:00am–11:40am Wednesday, March 18, 2020
Location: 210 E

Who is this presentation for?

  • Software developers, managers, and decision makers




Imagine looking into a mirror, but not seeing your own face. Instead, you’re looking in the eyes of Barack Obama or Angela Merkel. Your facial expressions are seamlessly transferred to the other person’s face in real time.

Martin Förtsch and Thomas Endres dig into a prototype developed by the TNG hardware hacking team that transfers faces from one person to another in real time based on deepfakes. Neural networks detect faces within a video input, then translate them and integrate them back to the video output. Through this technique it’s possible to project deceptively real imitations onto other people. The team used Keras-trained autoencoder networks and various face recognition algorithms.
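The core trick behind this kind of face swap is a shared encoder paired with one decoder per person. The sketch below is a deliberately simplified, linear stand-in (plain NumPy with random vectors instead of real face crops — not the team's actual Keras networks): one encoder is trained jointly with two person-specific decoders, and "swapping" means decoding person A's latent code with person B's decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for aligned face crops of two people, flattened to vectors.
dim, latent, n = 32, 8, 200
faces_a = rng.normal(size=(n, dim))
faces_b = rng.normal(size=(n, dim)) * np.linspace(0.5, 1.5, dim)

# One shared encoder and two person-specific decoders (linear maps here;
# the real prototype uses deep convolutional autoencoders).
E = rng.normal(scale=0.1, size=(dim, latent))    # shared encoder
D_a = rng.normal(scale=0.1, size=(latent, dim))  # decoder for person A
D_b = rng.normal(scale=0.1, size=(latent, dim))  # decoder for person B

def recon_loss(X, D):
    """Mean squared reconstruction error through the shared encoder."""
    return float(np.mean((X @ E @ D - X) ** 2))

loss_before = recon_loss(faces_a, D_a)

lr = 0.01
for _ in range(500):
    for X, D in ((faces_a, D_a), (faces_b, D_b)):
        Z = X @ E                           # encode into shared latent space
        err = Z @ D - X                     # reconstruction error
        grad_D = Z.T @ err / n
        grad_E = X.T @ (err @ D.T) / n
        D -= lr * grad_D                    # update this person's decoder
        E -= lr * grad_E                    # update the shared encoder

loss_after = recon_loss(faces_a, D_a)

# The "deepfake" step: encode A's faces, decode them with B's decoder.
swapped = (faces_a @ E) @ D_b
print(loss_before, loss_after, swapped.shape)
```

Because the encoder is shared across both people, the latent code is pushed to capture what the faces have in common (pose, expression, lighting), while each decoder learns to render one specific identity — which is exactly what makes the swap step work.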

You’ll get an entertaining and vivid introduction to the world of real-time deepfakes, with a particular focus on the deep learning techniques used within the application. You’ll see several live demonstrations and explore the legal and ethical implications.

Prerequisite knowledge

  • A basic understanding of terms like neural network, deep neural network, and autoencoder
  • Experience in IT (useful but not required)

What you'll learn

  • Learn how deepfakes work in their original implementation, how Deepfakes 2.0 was realized, and which challenges had to be solved
  • Discover how different types of neural networks like MobileNet and U-Net can be used for transfer learning
  • See demonstrations like neural network-powered face segmentation, image inpainting, and generative inpainting

Martin Förtsch


Martin Förtsch is an IT consultant at TNG, based in Unterföhring near Munich, who studied computer science. His focus areas are Agile development (mainly in Java), search engine technologies, information retrieval, and databases. As an Intel Software Innovator and Intel Black Belt Software Developer, he’s strongly involved in the development of open source software for gesture control with 3D cameras like Intel RealSense, and his team has built an augmented reality wearable prototype device based on this technology. He gives many talks at national and international conferences about AI, the internet of things, 3D camera technologies, augmented reality, and test-driven development. He received the JavaOne Rockstar award.


Thomas Endres


Thomas Endres is an associate partner at TNG in Munich. Besides his normal work for TNG’s customers, he creates prototypes with the company’s hardware hacking team, such as a see-through augmented reality device and a telepresence robotics system. In his spare time, he works on gesture control applications, such as those for controlling quadrocopters with bare hands. He’s involved in open source projects written in Java, C#, and various JavaScript languages, and he lectures at the University of Applied Sciences in Landshut. Thomas is passionate about software development and all other aspects of technology. As an Intel Software Innovator and Black Belt, he promotes new technologies like gesture control, AR/VR, and robotics around the world. He recently received a JavaOne Rockstar award. He studied IT at the TU Munich.
