Sep 9–12, 2019
Natural language processing with deep learning (Day 2)
Location: Santa Clara Room (Hilton)
Secondary topics: Deep Learning; Machine Learning; Text, Language, and Speech
NLP involves the application of machine learning and other statistical techniques to derive insights from human language. With large volumes of data exchanged as text (documents, tweets, email, chat, and so on), NLP techniques are indispensable to modern intelligent applications, which range from enterprise systems to everyday consumer tools.
Outline
Day 1
- Environment setup and data download
- Introduction to supervised learning
- Introduction to computational graphs
- Introduction to NLP and NLP tasks
- Representations for words: Word embeddings
- Hands-on: Word analogy problems
- Overview of deep learning frameworks
- Static versus dynamic computation graphs
- PyTorch basics
- Hands-on: PyTorch exercises
- Feed-forward networks for NLP: Multi-layer perceptrons
- Hands-on: Chinese document classification
- Convolutional networks: Modeling subword units
- Hands-on: Classifying names to ethnicities
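The word-analogy exercise above (solving "a is to b as c is to ?") can be sketched with nothing but the standard library: represent each word as a vector and pick the word closest, by cosine similarity, to b − a + c. The 3-d vectors below are hand-picked for illustration, not trained embeddings:

```python
import math

# Toy 3-d "embeddings" (hand-picked for illustration, not trained).
emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.5, 0.9, 0.1],
    "woman": [0.5, 0.1, 0.9],
    "apple": [0.1, 0.5, 0.5],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def analogy(a, b, c):
    """Solve a : b :: c : ? by nearest neighbor to (b - a + c)."""
    target = [bb - aa + cc for aa, bb, cc in zip(emb[a], emb[b], emb[c])]
    candidates = (w for w in emb if w not in {a, b, c})
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("man", "woman", "king"))  # -> queen (with these toy vectors)
```

Real embeddings (word2vec, GloVe) are learned from corpora and have hundreds of dimensions, but the lookup-and-nearest-neighbor mechanics are the same.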
Day 2
- Sequence modeling: Basics of modeling sequences, representing sequences as tensors, the importance of the language modeling task
- Recurrent neural networks (RNNs) to model sequences: Basic ideas
- Hands-on: Classification with an RNN
- Gated variants (long short-term memory (LSTM) and gated recurrent unit (GRU))
- Structural variants (bidirectional, stacked, tree)
- Hands-on: Generating sequences with an RNN
- From sequence models to sequence-to-sequence models: Core ideas, encoder-decoder architectures, applications—translation and summarization
- Attention: Core ideas and its role in sequence-to-sequence models
- Advanced topics
- Self-attention and the Transformer
- Contextualized embedding models: BERT, ELMo
- Hands-on: BERT
- Overview of DL modeling for common NLP tasks
- Choose your own adventure
- Hands-on: Work with an NLP problem end-to-end from a selection of tasks
- DL for NLP: Best practices
- Closing: When to use deep learning for NLP, when not to use deep learning for NLP, and summary
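The sequence-modeling core of Day 2 can be sketched without any framework. The stdlib-only toy below is a forward pass of a simple (Elman-style) RNN, h_t = tanh(W_xh·x_t + W_hh·h_{t-1} + b_h), with hand-picked weights (not trained) just to show how one hidden state is carried across time steps:

```python
import math

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def vadd(*vs):
    return [sum(xs) for xs in zip(*vs)]

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Elman RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""
    h = [0.0] * len(b_h)          # initial hidden state
    for x in xs:                  # one step per sequence element
        h = tanh_vec(vadd(matvec(W_xh, x), matvec(W_hh, h), b_h))
    return h                      # final hidden state summarizes the sequence

# Tiny illustrative setup: 2-d inputs, 2-d hidden state.
W_xh = [[1.0, 0.0], [0.0, 1.0]]
W_hh = [[0.5, 0.0], [0.0, 0.5]]
b_h = [0.0, 0.0]
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
h_final = rnn_forward(seq, W_xh, W_hh, b_h)
# A classifier would feed h_final into a linear layer and a softmax.
```

In the course itself this recurrence is handled by PyTorch modules, with gated variants (LSTM, GRU) replacing the plain tanh update to ease learning over long sequences.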
Prerequisite knowledge
- Working knowledge of Python and command-line familiarity
- Familiarity with basic linear algebra (matrix multiplication, vector dot products) and with derivatives of simple functions (useful but not required)
- General knowledge of machine learning (setting up experiments, evaluation, etc.) (useful but not required)
What you'll learn
- Understand basic concepts in natural language processing (NLP) and in deep learning as it applies to NLP
- Gain hands-on experience framing a real-world problem as an underlying NLP task and building a solution using deep learning