Recent years have seen significant advances in deep learning and AI capabilities. AI solutions can augment or replace mundane tasks, increase workforce productivity, and relieve human bottlenecks. Unlike traditional automation, these solutions handle cognitive tasks that once required human decision making. In some cases, deep learning has proven even more accurate than humans at identifying patterns and can therefore enable various kinds of automated, real-time decision making.
The advanced analytics team at Intel IT recently implemented an internal visual inference platform, a high-performance system for deep learning inference designed for production environments. The system enables easy deployment of many DL models in production and supports a closed feedback loop in which data flows in and decisions are returned through a fast REST API. It maximizes throughput through batching and smart in-memory caching while still supporting long short-term memory (LSTM) networks, and it can be deployed either as a cluster or as a standalone node.
To enable stream analytics at scale, the system was built on a modern microservices architecture using cutting-edge technologies such as TensorFlow, TensorFlow Serving, Redis, Flask, and more. It is optimized for easy deployment with Docker and Kubernetes and cuts the time to market for deploying a DL solution. Because it supports different kinds of models and various inputs, including images and video streams, the system can enable the deployment of smart visual inspection solutions with real-time decision making.
Moty Fania and Sergei Kom explain how Intel implemented the platform and share lessons learned along the way.
Moty Fania is a principal engineer and the CTO of the Advanced Analytics Group at Intel, which delivers AI and big data solutions across Intel. Moty has rich experience in ML engineering, analytics, data warehousing, and decision-support solutions. He led the architecture work and development of various AI and big data initiatives such as IoT systems, predictive engines, online inference systems, and more.
Sergei Kom is a senior software engineer in Intel’s Advanced Analytics Department. Sergei has extensive experience developing real-time applications using Spark Streaming, Kafka, Kafka Streams, and TensorFlow Serving. He enjoys learning new technologies and implementing them in new projects.
©2018, O'Reilly Media, Inc. • (800) 889-8969 or (707) 827-7019 • Monday-Friday 7:30am-5pm PT • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners. • confreg@oreilly.com