Ming-Wei Chang is a research scientist at Google AI Language in Seattle. He enjoys developing machine learning algorithms for practical problems, especially in the field of natural language processing. He won an Outstanding Paper award at ACL 2015 for his work on question answering over knowledge bases. Over the years, he has published more than 35 papers at top-tier conferences and won several international machine learning competitions, including contests on entity linking, power load forecasting, and sequential data classification. His recent paper with colleagues at Google AI Language, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” demonstrated the power of language model pre-training and achieved new state-of-the-art results on 11 natural language processing tasks.