Presented By
O’Reilly + Intel AI
Put AI to Work
April 15-18, 2019
New York, NY
Chang Ming-Wei
Research Scientist, Google


Ming-Wei Chang is a research scientist at Google AI Language in Seattle. He enjoys developing machine learning algorithms for practical problems, especially in the field of natural language processing. He won an Outstanding Paper award at ACL 2015 for his work on question answering over knowledge bases. Over the years, he has published more than 35 papers at top-tier conferences and won several international machine learning competitions, including contests on entity linking, power load forecasting, and sequential data classification. Together with his colleagues in Google AI Language, his recent paper, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” demonstrated the power of language model pre-training and obtained new state-of-the-art results on 11 natural language processing tasks.


1:00pm–1:40pm Thursday, April 18, 2019
Machine Learning, Models and Methods
Location: Regent Parlor
Secondary topics:  Models and Methods, Text, Language, and Speech
Chang Ming-Wei (Google)
We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers.
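The "jointly conditioning on both left and right context" above is achieved through BERT's masked-language-model pre-training objective: a fraction of input tokens is corrupted, and the model must recover the originals from the surrounding context on both sides. A minimal sketch of the input-corruption step, following the 80/10/10 masking recipe described in the BERT paper (the function name, toy vocabulary, and seed handling here are illustrative assumptions, not Google's implementation):

```python
import random

# Toy vocabulary used only for the 10% random-replacement case.
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Corrupt ~mask_prob of tokens for masked-LM pre-training.

    Of the selected positions: 80% become [MASK], 10% become a
    random vocabulary token, 10% are left unchanged. The model's
    task is to predict the original token at each selected
    position, using context from BOTH directions.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    targets = {}  # position -> original token to predict
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue  # position not selected for prediction
        targets[i] = tok
        r = rng.random()
        if r < 0.8:
            corrupted[i] = "[MASK]"
        elif r < 0.9:
            corrupted[i] = rng.choice(VOCAB)
        # else: keep the original token (remaining 10%)
    return corrupted, targets

corrupted, targets = mask_tokens(["the", "cat", "sat", "on", "the", "mat"])
```

Because the prediction targets sit anywhere in the sequence, the encoder cannot rely on a strict left-to-right factorization, which is the key difference from earlier unidirectional language-model pre-training.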