Multitask learning is an approach to problem solving that allows supervised algorithms to learn more than one objective (or task) at once, in parallel. Ryan Micallef offers an overview of opportunities for multitask learning in natural language processing, computer vision, and healthcare and shares a neural network, implemented in PyTorch, that is trained to classify news articles using multitask learning. You’ll learn how to implement custom layers, models, and loss functions in PyTorch and explore private-shared component analysis.
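The abstract's core setup can be sketched in a few lines of PyTorch: a shared encoder feeds one classification head per task, and the per-task losses are summed into a single training objective. This is a minimal illustration, not the talk's actual prototype; the vocabulary size, layer sizes, and the example tasks (a hypothetical "topic" and "publication" label per article) are all assumptions.

```python
import torch
import torch.nn as nn

class MultitaskClassifier(nn.Module):
    """Shared text encoder with one classification head per task."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, task_num_classes):
        super().__init__()
        # EmbeddingBag averages token embeddings into one vector per article
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim)
        self.shared = nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU())
        # A custom head (just a linear layer here) per named task
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden_dim, n) for task, n in task_num_classes.items()
        })

    def forward(self, token_ids, task):
        h = self.shared(self.embedding(token_ids))
        return self.heads[task](h)

# Toy batch: 8 articles of 20 token ids each (sizes are illustrative)
model = MultitaskClassifier(vocab_size=10_000, embed_dim=64, hidden_dim=32,
                            task_num_classes={"topic": 4, "publication": 3})
tokens = torch.randint(0, 10_000, (8, 20))
loss_fn = nn.CrossEntropyLoss()

# Multitask objective: sum of the per-task losses through the shared encoder
loss = loss_fn(model(tokens, "topic"), torch.randint(0, 4, (8,))) \
     + loss_fn(model(tokens, "publication"), torch.randint(0, 3, (8,)))
loss.backward()
```

Because both heads backpropagate through the same shared layers, gradients from each task shape one common representation, which is what makes the later private-shared analysis possible.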
Multitask training, combined with regularization techniques such as adversarial training, allows neural networks to learn task-specific and task-agnostic representations (with respect to the tasks in the multitask training setup). Analyzing these representations (i.e., private-shared component analysis) yields insight into how tasks relate to one another. The prototype demonstrates the use of publication-specific language and language shared across publications in articles of different types, from politics to sports to economics, based on multitask learning and private-shared component analysis.
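One common way to separate task-specific from task-agnostic features, sketched below under assumptions (the talk does not spell out its exact architecture), is a private encoder per task plus a shared encoder, with an adversarial task discriminator on the shared features via gradient reversal, and an orthogonality penalty keeping the two subspaces apart. All names and dimensions here are illustrative.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; negates gradients on the backward pass,
    so the shared encoder learns to *fool* the task discriminator."""
    @staticmethod
    def forward(ctx, x):
        return x
    @staticmethod
    def backward(ctx, grad):
        return -grad

embed_dim, hidden, n_tasks = 64, 32, 3
shared_enc = nn.Linear(embed_dim, hidden)                  # task-agnostic features
private_enc = nn.ModuleList(nn.Linear(embed_dim, hidden)   # one private encoder per task
                            for _ in range(n_tasks))
discriminator = nn.Linear(hidden, n_tasks)                 # guesses the task from shared features

x = torch.randn(8, embed_dim)                # toy batch of articles from task 0
task = torch.zeros(8, dtype=torch.long)

shared = torch.relu(shared_enc(x))
private = torch.relu(private_enc[0](x))

# Adversarial loss: the discriminator is trained to identify the source task,
# while the reversed gradient pushes shared features toward task invariance.
adv_loss = nn.functional.cross_entropy(
    discriminator(GradReverse.apply(shared)), task)

# Orthogonality penalty discourages overlap between private and shared features.
ortho_loss = (shared * private).sum(dim=1).pow(2).mean()

(adv_loss + 0.1 * ortho_loss).backward()
```

After training, inspecting what the shared encoder captures versus each private encoder is the private-shared component analysis the abstract describes: shared features carry language common across publications, private features carry publication-specific language.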
Ryan Micallef is a research engineer at Cloudera Fast Forward Labs focused on studying emerging machine learning technologies and helping clients apply them. Ryan is also an attorney barred in New York and spent nearly a decade as an intellectual property litigator focused on technical cases. Ryan holds a bachelor’s degree in computer science from Georgia Tech and a JD from Brooklyn Law School. He spends his free time soldering circuits and wrenching on motorcycles. He also teaches microcontroller programming at his local hackerspace, NYC Resistor.
©2018, O’Reilly UK Ltd • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners.