Harnessing the power of data is the future of business. GoDaddy is constantly working to improve its customer experience and internal operations by understanding massive amounts of data. To that end, it is transforming its business with Hadoop and an enterprise-wide, Kafka-backed data ingest pipeline, along with Elasticsearch, Spark, and Cassandra, to perform anomaly detection, real-time log visualization, alerting, remediation, and batch reporting on hundreds of thousands of events per second across its product and IT data.
Felix Gorodishter shares GoDaddy’s big data journey from a farm of data silos to a centralized platform capable of supporting data ingest and visualization throughout its enterprise. Learn how GoDaddy collects and manages its data, which ranges from business units like hosting and domains to network and hardware events across its fleet of servers and network devices.
As GoDaddy transformed its data ingest, it also took on the challenge of understanding what it was collecting in order to answer key business questions.
Felix discusses how GoDaddy went about answering those questions with a wide range of technologies, including Kafka, Spark, Hadoop, Elasticsearch, and other open source tools.
Felix Gorodishter is a software architect at GoDaddy. Felix is a web developer, technologist, entrepreneur, husband, and daddy.
©2017, O'Reilly Media, Inc. • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners.
Apache Hadoop, Hadoop, Apache Spark, Spark, and Apache are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries and are used with permission. The Apache Software Foundation has no affiliation with and does not endorse or review the materials provided at this event, which is managed by O'Reilly Media and/or Cloudera.