Presented By O'Reilly and Cloudera
Make Data Work
Feb 17–20, 2015 • San Jose, CA
Julien Le Dem

Principal Engineer, WeWork

Website | @J_

Julien Le Dem is a Data Systems Engineer at Twitter. Previously he was a Principal Engineer at Yahoo. He contributes to a number of Hadoop-related projects, including HCatalog, and is a PMC member of Apache Pig.

Sessions

1:30pm–2:10pm Friday, 02/20/2015
Hadoop Platform
Location: 210 C/G
Julien Le Dem (WeWork)
Average rating: 1.40 (5 ratings)
Parquet is a columnar format designed to be efficient and interoperable across the Hadoop ecosystem. Its integration with most processing frameworks and serialization models makes it easy to use in existing ETL and processing pipelines, while preserving flexibility of choice in the query engine.
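The columnar layout the description refers to can be illustrated with a plain-Python sketch. This is illustrative only, not the Parquet file format or its API: the point is that storing each field's values together lets a query read one column without touching the others.

```python
# Illustrative sketch of row-oriented vs column-oriented layout.
# The records and field names below are made up for the example.
rows = [
    {"user": "alice", "clicks": 3, "country": "FR"},
    {"user": "bob",   "clicks": 7, "country": "US"},
    {"user": "carol", "clicks": 5, "country": "FR"},
]

# Row-oriented storage keeps whole records together; reading one field
# still means scanning every record.
row_store = rows

# Column-oriented storage keeps each field's values together.
col_store = {field: [r[field] for r in rows] for field in rows[0]}

# An aggregate over one column touches only that column's data --
# the same access pattern a columnar format like Parquet enables on disk.
total_clicks = sum(col_store["clicks"])
```

Homogeneous per-column value lists are also what make the compression and encoding schemes of columnar formats effective.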
2:20pm–3:00pm Friday, 02/20/2015
Office Hour
Location: Table B
Julien Le Dem (WeWork)
Thinking about using Apache Parquet as a basis for ETL and analytics? A chat with Julien can save you a lot of wasted time. He'll answer questions about integrating Parquet into existing ETL and processing pipelines, and about using Parquet with a wide variety of data analysis tools, such as Spark, Impala, Pig, Hive, and Cascading. He'll also discuss the future of Parquet.