Ming-Wei Chang is a research scientist at Google AI Language. He enjoys developing interesting machine learning algorithms for practical problems, especially in the field of natural language processing. He has published more than 35 papers at top-tier conferences and won an outstanding paper award at ACL 2015 for his work on question answering over knowledge bases. He has also won several international machine learning competitions on topics such as entity linking, power load forecasting, and sequential data classification. His recent paper, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”—co-written with his colleagues in Google AI Language—demonstrates the power of language model pretraining and establishes a new state of the art on eleven natural language processing tasks.
©2019, O'Reilly Media, Inc.