Presented By O'Reilly and Cloudera
Make Data Work
September 25–26, 2017: Training
September 26–28, 2017: Tutorials & Conference
New York, NY

Key big data architectural considerations for deploying in the cloud and on-premises (sponsored by NetApp)

1:15pm–1:55pm Thursday, September 28, 2017
Location: 1A 04/05
Average rating: 1.00 (1 rating)

What you'll learn

  • Explore big data architectural challenges and learn how to address them to create a cost-optimized solution for the rapid deployment of business-critical applications that meet corporate SLAs


Architects can design an analytics platform on data center or public cloud infrastructure, or they can leverage a public cloud service such as Azure HDInsight or Amazon EMR. Often, the location where data is generated determines the location of analytics services, even though it may not be the most cost-effective, flexible, or performant option. Another challenge is that when analytics applications become business critical, SLAs for application performance, backup, and disaster recovery can be difficult to meet at very large (petabyte) scale. Additionally, some analytics services may be available only in certain clouds while the rest of the analytics happen in other clouds or in the data center; to deploy a comprehensive, cohesive solution, the data needs to be able to flow between multiple clouds and your data center.

Finally, as more applications are built on top of Hadoop data, creating dev and test environments and validating them with production data can also pose a problem. To make matters worse, data can exist in different formats, such as files or objects, and moving it into HDFS requires a data transformation effort. Together, these challenges make big data deployments long and expensive.

Karthikeyan Nagalingam discusses big data architectural challenges and how to address them and explains how to create a cost-optimized solution for the rapid deployment of business-critical applications that meet corporate SLAs today and into the future.

This session is sponsored by NetApp.


Karthikeyan Nagalingam


Karthikeyan Nagalingam is a senior technical marketing engineer at NetApp, where he is responsible for defining and developing data protection technologies for big data analytics, producing best practices documentation, and helping customers implement Hadoop and NoSQL solutions. Karthikeyan has extensive experience architecting Hadoop solutions in the cloud, in hybrid cloud, and on premises, as well as deploying and developing in Linux environments. He has developed numerous proofs of concept, worked with customers on deploying Hadoop solutions, and spoken at many industry, customer, and partner events. He holds a patent for distributed data storage and processing techniques. Karthikeyan holds an MS in software systems from Birla Institute of Technology and Science and a bachelor of engineering from SriRam Engineering College.