Idealo.de has a dedicated service to provide hotel price comparisons. The company receives dozens of images for each hotel and faces the challenge of choosing the most “attractive” image for its offer comparison pages, as photos can be just as important for bookings as reviews. With millions of hotel offers, more than 100 million images need an “attractiveness” assessment.
Idealo.de addressed this challenge by implementing an aesthetic and technical image quality classifier based on Google’s research paper “NIMA: Neural Image Assessment.” NIMA consists of two convolutional neural networks (CNNs) that aim to predict the aesthetic and technical quality of images, respectively. The models are trained via transfer learning, where ImageNet-pretrained CNNs are fine-tuned for each quality classification task.
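The transfer-learning setup described above can be sketched roughly as follows. This is a minimal illustration, not idealo.de’s actual code: it assumes a Keras MobileNet backbone pretrained on ImageNet (one of the base CNNs used in the NIMA paper), with the classification head replaced by a 10-way softmax over quality score buckets (ratings 1–10); the dropout rate and two-stage fine-tuning schedule shown are typical choices, not confirmed specifics.

```python
# Sketch of NIMA-style transfer learning with a Keras MobileNet backbone.
# Assumptions (not from the talk abstract): MobileNet base, 0.75 dropout,
# 10 score buckets, and a freeze-then-unfreeze fine-tuning schedule.
from tensorflow.keras.applications import MobileNet
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Model

# ImageNet-pretrained feature extractor, classification head removed.
base = MobileNet(input_shape=(224, 224, 3), include_top=False,
                 weights="imagenet", pooling="avg")

# New head: predict a probability distribution over the 10 score buckets,
# rather than a single scalar quality score.
x = Dropout(0.75)(base.output)
scores = Dense(10, activation="softmax")(x)
model = Model(base.input, scores)

# Stage 1: freeze the pretrained layers and train only the new head.
# Stage 2 (not shown): unfreeze and fine-tune end to end at a lower
# learning rate.
for layer in base.layers:
    layer.trainable = False
```

Predicting a full distribution over score buckets (instead of regressing a single mean score) is what makes the Earth Mover’s Distance a natural objective, since human ratings for an image are themselves a distribution.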
Christopher Lennan shares the training approach and peculiarities of the models (e.g., the Earth Mover’s Distance objective function) as well as major insights gained from each iteration, including the importance of collecting high-quality labeled data. Finally, he sheds light on what the models actually learned by visualizing their convolutional filter weights and output nodes, and illustrates how this helped idealo.de optimize the models.
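The Earth Mover’s Distance objective mentioned above compares the cumulative distributions of the predicted and ground-truth score histograms, so predictions that are off by one score bucket are penalized less than predictions that are far off, unlike a plain cross-entropy loss, which treats all wrong buckets equally. A framework-free numpy sketch of the r=2 form used in the NIMA paper:

```python
import numpy as np

def earth_movers_distance(y_true, y_pred, r=2):
    """EMD between two normalized score distributions over ordered
    buckets, computed on their cumulative distribution functions
    (r=2, as in the NIMA paper)."""
    cdf_true = np.cumsum(y_true)
    cdf_pred = np.cumsum(y_pred)
    return np.mean(np.abs(cdf_true - cdf_pred) ** r) ** (1.0 / r)

# One-hot "true" rating in bucket 3 (scores 1-10 -> 10 buckets).
p = np.eye(10)[2]
near = np.eye(10)[3]   # prediction off by one bucket
far = np.eye(10)[9]    # prediction off by seven buckets

# EMD is zero for a perfect match and grows with how far the
# predicted mass is from the true bucket.
print(earth_movers_distance(p, p))     # 0.0
print(earth_movers_distance(p, near))  # small
print(earth_movers_distance(p, far))   # larger
```

This ordinal-awareness is the main reason EMD suits image-quality ratings: scores live on an ordered scale, and a model that predicts “4” when the truth is “5” is much less wrong than one that predicts “10.”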
Christopher Lennan is a senior data scientist at idealo.de, where he works on computer vision problems to improve the product search experience. In previous positions, he applied machine learning methods to fMRI and financial data. Christopher holds a master’s degree in statistics from Humboldt-Universität zu Berlin.