There is great demand for machine learning and artificial intelligence applications in the audio domain, including home surveillance (detecting breaking glass and alarm events), security (detecting explosions and gunshots), self-driving cars (adding a layer of safety through sound event detection), predictive maintenance (predicting machine failures from vibration data in manufacturing), conveying emotion in real-time translation, and music synthesis.
Swetha Machanavajhala and Xiaoyong Zhu explain how to make the auditory world more inclusive and meet this demand across sectors by applying deep learning to audio in Azure. Swetha and Xiaoyong detail how to train a deep learning model on Microsoft Azure for sound event detection using an urban sounds dataset and offer an overview of working with audio data, along with references to Data Science Virtual Machine (DSVM) notebooks.
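A typical first step in a sound event detection pipeline like the one described is converting raw audio into a time-frequency representation before feeding it to a model. The sketch below is illustrative only and not the speakers' actual pipeline; the `log_spectrogram` function and its parameters are hypothetical, and it uses plain NumPy in place of a dedicated audio library.

```python
import numpy as np

def log_spectrogram(signal, frame_len=512, hop=256):
    """Compute a log-magnitude spectrogram, a common model input
    for sound event detection. Illustrative sketch only."""
    # Slice the signal into overlapping frames and apply a Hann window.
    n_frames = 1 + (len(signal) - frame_len) // hop
    window = np.hanning(frame_len)
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # Magnitude of the real FFT of each frame, then log compression.
    mag = np.abs(np.fft.rfft(frames, axis=1))
    return np.log1p(mag)

# One second of a 440 Hz tone at a 16 kHz sampling rate stands in
# for a real urban-sounds clip.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
spec = log_spectrogram(tone)
print(spec.shape)  # (n_frames, frame_len // 2 + 1)
```

The resulting 2D array can be treated like an image, which is why convolutional architectures are a natural fit for this kind of audio classification task.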
Swetha Machanavajhala is a software engineer for Azure Networking at Microsoft, where she builds tools to help engineers detect and diagnose network issues within seconds. She is passionate about building products and awareness for people with disabilities and has led several related projects at hackathons, driving them from idea to beta launch and winning multiple awards. Swetha is a co-lead of the Disability Employee Resource Group, where she represents the community of people who are deaf or hard of hearing, and is a member of the ERG chair committee. She is also a frequent speaker at both internal and external events.
Xiaoyong Zhu is a senior data scientist at Microsoft, where he focuses on distributed machine learning and its applications.
©2018, O'Reilly Media, Inc. • (800) 889-8969 or (707) 827-7019 • Monday-Friday 7:30am-5pm PT • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners. • confreg@oreilly.com