Lyft is on a mission to improve people’s lives with the world’s best transportation. As part of this mission, Lyft invests heavily in open source infrastructure and tooling. Kubernetes has emerged as the next generation of cloud native infrastructure, supporting a wide variety of distributed workloads, while Apache Spark powers both machine learning and large-scale ETL workloads. By combining the flexibility of Kubernetes with the data-processing power of Apache Spark, Lyft is able to take its ETL data processing to a new level of scale and efficiency.
Li Gao and Bill Graham discuss the challenges the Lyft team faced and solutions they developed to support Apache Spark on Kubernetes in production and at scale.
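For context on the setup the talk covers: since version 2.3, Apache Spark can use Kubernetes as its native cluster manager, scheduling driver and executor pods directly on a cluster. A minimal, hedged sketch of such a submission follows; the API server address, image name, job class, and jar path are all placeholders, not details from the talk.

```shell
# Sketch: submitting a Spark application in cluster mode to a Kubernetes
# cluster. All concrete values below (host, image, class, jar) are
# illustrative placeholders.
spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --name example-etl-job \
  --class com.example.EtlJob \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image=example/spark:latest \
  local:///opt/spark/jars/etl-job.jar
```

With this mode, Spark launches the driver as a pod, and the driver in turn requests executor pods from the Kubernetes API, so Spark workloads share the same scheduling and resource-management plane as other services on the cluster.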
Li Gao is the tech lead for the Cloud Native Spark Compute Initiative at Lyft. Previously, Li held technical leadership positions focusing on cloud native and hybrid cloud data platforms at scale at Salesforce, Fitbit, Marin Software, and a few startups. Besides Spark, Li has scaled and productionized open source projects including Presto, Apache HBase, Apache Phoenix, Apache Kafka, Apache Airflow, and Apache Hive.
Bill Graham is an architect on the data platform team at Lyft. Bill’s primary focus is data processing applications and analytics infrastructure. Previously, he was a staff engineer on the data platform team at Twitter, where he built streaming compute, interactive query, batch query, ETL, and data management systems; a principal engineer at CBS Interactive and CNET Networks, where he developed ad targeting and content publishing infrastructure; and a senior engineer at Logitech, focusing on webcam streaming and messaging applications. He’s contributed to a number of open source projects, including Apache HBase, Apache Hive, and Presto, and is a PMC member of Apache Pig and Apache Heron (incubating).
©2019, O'Reilly Media, Inc. • All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners.