Deep learning technologies for giant hogweed eradication

Who is this presentation for?
- IT engineers and managers who want to see real-world use cases and learn how to combine machine learning with big data technologies
Description
Giant hogweed is a highly toxic plant native to the Western Caucasus. It has spread across central and western Europe, and sightings have been reported in North America. In Europe, landowners are obliged to eradicate it because of its toxicity and invasiveness. However, finding and removing giant hogweed across large areas of land is difficult, since doing so manually is a very cumbersome process.
Automating the detection of giant hogweed with technologies such as drones and machine learning-based image recognition and detection is an effective way to address this problem. Naoto Umemori and Masaru Dobashi outline how they designed the architecture, how they took advantage of big data and machine and deep learning technologies, and the lessons they learned through the project. For example, they integrated a drone, Apache Hadoop, Apache Spark, and TensorFlow to achieve usability, flexibility, and scalability for data engineers and data analysts. You'll discover why this integration was needed, the technical challenges it posed from an enterprise viewpoint, and tips for leveraging this open source software.
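
To make the kind of integration described above concrete, the following is a minimal sketch (an illustration for this listing, not the speakers' actual pipeline) of how Apache Spark can distribute TensorFlow inference over drone imagery stored in HDFS. The HDFS paths, the model file hogweed.h5, and the 0.5 detection threshold are all assumptions for illustration.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hogweed-detection").getOrCreate()

# Read raw drone photos from HDFS as (path, bytes) pairs.
images = spark.sparkContext.binaryFiles("hdfs:///drone/images")

def detect(partition):
    # Load the model once per partition rather than once per image.
    import tensorflow as tf
    model = tf.keras.models.load_model("/models/hogweed.h5")  # hypothetical model file
    for path, data in partition:
        img = tf.io.decode_jpeg(data, channels=3)
        img = tf.image.resize(img, (224, 224)) / 255.0
        score = float(model(tf.expand_dims(img, 0))[0][0])  # assumes a single sigmoid output
        yield path, score

# Keep only images the model flags as likely giant hogweed.
hits = images.mapPartitions(detect).filter(lambda pair: pair[1] > 0.5)
hits.saveAsTextFile("hdfs:///drone/detections")

Running inference inside mapPartitions keeps the heavy lifting on the Hadoop cluster itself, which is one way an architecture like this can scale as the volume of drone imagery grows.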
Prerequisite knowledge
- A basic knowledge of Hadoop, Spark, and TensorFlow (useful but not required)
What you'll learn
- How to leverage machine and deep learning alongside big data technologies, use open source software efficiently, and design an architecture around concrete use cases

Naoto Umemori
NTT DATA
Naoto Umemori is a senior infrastructure engineer and deputy manager at NTT DATA, working in the technology and innovation area. He’s spent around 10 years in the platform and infrastructure field, focusing mainly on the open source software technology stack.

Masaru Dobashi
NTT DATA
Masaru Dobashi is a manager, IT specialist, and architect at NTT DATA, where he leads the OSS professional service team and is responsible for introducing Hadoop, Spark, Storm, and other OSS middleware into enterprise systems. Previously, Masaru developed an enterprise Hadoop cluster of over 1,000 nodes, one of the largest Hadoop clusters in Japan, and designed and provisioned several other kinds of clusters using non-Hadoop open source software, such as Spark and Storm.