Presented By O'Reilly and Cloudera
Make Data Work
31 May–1 June 2016: Training
1 June–3 June 2016: Conference
London, UK

Applications of natural language understanding: Tools and technologies

Alyona Medelyan (Thematic)
14:05–14:45 Friday, 3/06/2016
Data science & advanced analytics
Location: Capital Suite 17 Level: Intermediate
Average rating: 3.33 (6 ratings)

Prerequisite knowledge

Attendees should have a basic understanding of NLP and machine learning.

Description

If you are dealing with natural language, whether in product reviews, user feedback, or interactions with customers, you are likely to benefit from the latest advances in natural language understanding. With the deep learning revolution, AI is no longer the privilege of large corporations; most businesses that deal with language data can apply NLU techniques to their existing solutions or use them to create new services.

NLU algorithms are available in many open source tools, including machine learning and deep learning toolkits and packages such as word2vec. And of course there are many commercially available APIs. But what are their advantages and limitations? When should you build your own solution, and when is it better to buy? Alyona Medelyan surveys the newest tools for dealing with language, showcases some common business use cases, and provides insight into what's brewing in academic research and what we can expect in the near future.
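To give a flavour of the open source tooling mentioned above, here is a minimal sketch of training word vectors with gensim's Word2Vec implementation. The choice of gensim, the toy corpus, and the parameters are illustrative assumptions; the talk does not prescribe a specific library.

```python
# Minimal word2vec sketch using gensim (an assumption; any word2vec
# implementation would do). The corpus and parameters are illustrative only.
from gensim.models import Word2Vec

# Tiny toy corpus: each document is a list of tokens.
sentences = [
    ["the", "product", "arrived", "late"],
    ["great", "product", "and", "fast", "delivery"],
    ["support", "resolved", "my", "issue", "quickly"],
    ["delivery", "was", "slow", "and", "support", "unhelpful"],
]

# Train small vectors; realistic use cases need far more data.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=50)

# Query terms close to "delivery" in the learned vector space.
print(model.wv.most_similar("delivery", topn=3))
```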


Alyona Medelyan

Thematic

Alyona Medelyan has been working on algorithms that make sense of language data for over a decade. Her passion lies in helping businesses extract useful knowledge from text. As part of her PhD, she proved that her open source algorithm, Maui, can be as accurate as humans at finding keywords. She has worked with large multinationals like Cisco and Google, has led R&D teams, and has consulted for small and large companies around the globe. Alyona now runs Thematic, a customer insight company.


Comments

14/07/2016 13:22 BST

In my research paper, I built an NLTK sentiment engine using lexicon methods, and I can honestly say that there is no single algorithm that makes one AI better than the next. AI is an ongoing, evolving process: if you don't keep teaching your AI, how can it ever evolve into something more?
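For readers curious about the lexicon-based approach the commenter mentions, here is a minimal sketch using NLTK's VADER analyzer. VADER is one lexicon method shipped with NLTK and is used here purely as an illustrative assumption, not the commenter's exact setup.

```python
# Minimal lexicon-based sentiment sketch with NLTK's VADER analyzer
# (an assumption; the commenter's actual engine may differ).
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-off download of the sentiment lexicon

analyzer = SentimentIntensityAnalyzer()
for review in ["The product is great!", "Support was slow and unhelpful."]:
    # polarity_scores returns neg/neu/pos proportions and a compound score in [-1, 1].
    print(review, analyzer.polarity_scores(review))
```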