Presented By O’Reilly and Intel Nervana
Put AI to work
September 17-18, 2017: Training
September 18-20, 2017: Tutorials & Conference
San Francisco, CA

word2vec and friends

Bruno Gonçalves (New York University)
9:00am–12:30pm Monday, September 18, 2017
Implementing AI
Location: Nob Hill 2 & 3 | Level: Intermediate
Secondary topics: Algorithms, Case studies
Average rating: ***** (5.00, 1 rating)

Prerequisite Knowledge

  • A basic understanding of TensorFlow and feed-forward neural networks

Materials or downloads needed in advance

  • A laptop with Python and TensorFlow installed

What you'll learn

  • Explore the neural network architecture of word2vec-type models
  • Learn how they are implemented in TensorFlow and how related models might be developed for future applications

Description

Word embeddings have received a lot of attention ever since Tomas Mikolov published word2vec in 2013 and showed that the embeddings that a neural network learned by “reading” a large corpus of text preserved semantic relations between words. As a result, this type of embedding began to be studied in more detail and applied to more serious NLP and IR tasks, such as summarization and query expansion. More recently, researchers and practitioners alike have come to appreciate the power of this type of approach, creating a burgeoning cottage industry centered around applying Mikolov’s original approach to different areas.
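
For readers who have not seen this semantic-relation property in action, here is a minimal sketch using the gensim library and Google's pretrained News vectors. Both the library choice and the file path are assumptions made for illustration only; they are not requirements for the tutorial, which uses TensorFlow.

    # Minimal illustration of the analogy property of word2vec embeddings.
    # gensim and the pretrained-vectors file below are illustrative assumptions;
    # the GoogleNews file is a large (~3.4 GB) separate download.
    from gensim.models import KeyedVectors

    vectors = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True)

    # "king" - "man" + "woman" should land near "queen".
    print(vectors.most_similar(positive=["king", "woman"],
                               negative=["man"], topn=3))

    # Semantically related words sit close together in the embedding space.
    print(vectors.similarity("Paris", "France"))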

Bruno Gonçalves explores word2vec and its variations, discussing the main concepts and algorithms behind the neural network architecture used in word2vec and the word2vec reference implementation in TensorFlow. Bruno then presents a bird’s-eye view of the emerging field of 2vec methods (phrase2vec, doc2vec, dna2vec, node2vec, etc.) that use variations of the word2vec neural network architecture.

Outline:

Neural network architecture and algorithms underlying word2vec

  • Basic intuition
  • Skip-gram (a minimal TensorFlow sketch follows this outline)
  • Continuous bag of words
  • Hierarchical softmax
  • Cross-entropy
  • Negative sampling
  • Semantic structure and analogies
  • Online sources for pretrained embeddings
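
To make the skip-gram and negative-sampling items above concrete, the sketch below shows how such a model is commonly expressed in TensorFlow 1.x syntax. It is a minimal illustration, not the reference implementation discussed in the session: the vocabulary size, embedding dimension, number of negative samples, and the toy batch are all made-up assumptions, and tf.nn.nce_loss uses noise-contrastive estimation, a close relative of word2vec's negative-sampling objective.

    # Minimal skip-gram sketch with a sampled loss, TensorFlow 1.x syntax.
    # All hyperparameters and the toy batch are illustrative assumptions.
    import numpy as np
    import tensorflow as tf

    vocab_size = 10000      # assumed vocabulary size
    embedding_dim = 128     # assumed embedding dimensionality
    num_negative = 64       # negative samples drawn per positive pair

    # Center-word ids and their context-word ids (one positive pair per row).
    center_words = tf.placeholder(tf.int32, shape=[None])
    context_words = tf.placeholder(tf.int32, shape=[None, 1])

    # Input embedding matrix: one dense vector per vocabulary word.
    embeddings = tf.Variable(
        tf.random_uniform([vocab_size, embedding_dim], -1.0, 1.0))
    center_vecs = tf.nn.embedding_lookup(embeddings, center_words)

    # Output ("context") weights and biases used by the sampled loss.
    nce_weights = tf.Variable(
        tf.truncated_normal([vocab_size, embedding_dim],
                            stddev=1.0 / np.sqrt(embedding_dim)))
    nce_biases = tf.Variable(tf.zeros([vocab_size]))

    # Sampled loss that stands in for the full softmax over the vocabulary.
    loss = tf.reduce_mean(
        tf.nn.nce_loss(weights=nce_weights,
                       biases=nce_biases,
                       labels=context_words,
                       inputs=center_vecs,
                       num_sampled=num_negative,
                       num_classes=vocab_size))
    optimizer = tf.train.GradientDescentOptimizer(1.0).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # A made-up batch of (center, context) id pairs, just to show the feed.
        batch_centers = np.array([0, 1, 2, 3])
        batch_contexts = np.array([[1], [0], [3], [2]])
        _, batch_loss = sess.run([optimizer, loss],
                                 feed_dict={center_words: batch_centers,
                                            context_words: batch_contexts})
        print("loss on toy batch:", batch_loss)

A real training loop would stream (center, context) pairs produced by sliding a context window over a corpus; only the feed_dict would change from batch to batch.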

A brief refresher of TensorFlow

A detailed discussion of TensorFlow’s reference implementation

word2vec variations and their applications

Bruno Gonçalves

New York University

Bruno Gonçalves is a Moore-Sloan fellow at NYU’s Center for Data Science. With a background in physics and computer science, Bruno has spent his career exploring the use of datasets from sources as diverse as Apache web logs, Wikipedia edits, Twitter posts, epidemiological reports, and census data to analyze and model human behavior and mobility. More recently, he has been focusing on the application of machine learning and neural network techniques to analyze large geolocated datasets.

Comments

Gabriel Moreira | Machine Learning Engineer
09/12/2017 7:22am PDT

Will you cover the application of RNNs and CNNs to NLP?