
How to protect big data in a containerized environment

Thomas Phelan (HPE BlueData)
11:00am–11:40am Thursday, March 28, 2019
Average rating: 4.50 (2 ratings)

Who is this presentation for?

  • Data architects, data security experts, and anyone responsible for securing data in big data, AI, ML, and DL environments

Level

Intermediate

What you'll learn

  • Understand why data security is important in a big data environment
  • Learn the most common methods of data security
  • Discover why HDFS Transparent Data Encryption is essential to securing big data in the enterprise
  • Explore the common pitfalls of implementing HDFS TDE in bare-metal and virtualized big data environments and see how to mitigate them

Description

Every enterprise spends significant resources to protect its data. This is especially true in the case of big data, since some of this data may include sensitive or confidential customer and financial information. Common methods for protecting data include permissions and access controls, as well as the encryption of data at rest and in flight.
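
As a minimal sketch of the permissions side, using the standard Hadoop FileSystem Java API with a hypothetical directory path, restricting access to a sensitive directory might look like this:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class RestrictSensitiveDir {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml/hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Hypothetical directory holding sensitive records.
            Path sensitive = new Path("/data/finance");

            // Owner: rwx; group: r-x; others: no access.
            fs.setPermission(sensitive, new FsPermission((short) 0750));
            fs.close();
        }
    }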

The Hadoop community has rolled out Transparent Data Encryption (TDE) support in HDFS. With TDE, data is encrypted transparently as the big data application writes it and isn't decrypted again until an authorized application reads it. The data remains encrypted during its entire lifespan, in transit and at rest, except when it's being specifically accessed by a processing application.
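
A minimal sketch of that transparency, assuming an encryption key named "demoKey" already exists in the cluster's KMS and using hypothetical paths: the writer and reader below are ordinary HDFS client calls, with no encryption logic in the application itself.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.client.HdfsAdmin;
    import org.apache.hadoop.io.IOUtils;

    public class TdeTransparencyDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Mark a directory as an encryption zone, keyed by a key
            // an administrator has already created in the KMS.
            Path zone = new Path("/secure/zone");
            fs.mkdirs(zone);
            new HdfsAdmin(fs.getUri(), conf).createEncryptionZone(zone, "demoKey");

            // Plain write: the HDFS client encrypts on the way out.
            try (FSDataOutputStream out = fs.create(new Path(zone, "records.csv"))) {
                out.writeBytes("id,amount\n42,100.00\n");
            }

            // Plain read: the HDFS client decrypts on the way back in.
            try (FSDataInputStream in = fs.open(new Path(zone, "records.csv"))) {
                IOUtils.copyBytes(in, System.out, 4096, false);
            }
            fs.close();
        }
    }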

TDE is an excellent approach for protecting data stored in data lakes built on the latest versions of HDFS. However, it comes with its own challenges and limitations. Systems that want to use TDE require tight integration with enterprise-wide Kerberos Key Distribution Center (KDC) services and Key Management Systems (KMS). This integration isn’t easy to set up or maintain. These issues can be even more challenging in a virtualized or containerized environment where one Kerberos realm may be used to secure the big data compute cluster and a different Kerberos realm may be used to secure the HDFS file system accessed by this cluster.
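
As a rough sketch of the integration points, assuming hypothetical host names, realm, and keytab path (and noting that the KMS property name varies across Hadoop versions), a secured client setup has to wire together Kerberos login and the key provider:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class SecureClientSetup {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Require Kerberos for all Hadoop RPC.
            conf.set("hadoop.security.authentication", "kerberos");

            // Point the HDFS client at the enterprise KMS so it can fetch
            // the encryption keys for TDE zones (Hadoop 2.x property name;
            // Hadoop 3.x uses hadoop.security.key.provider.path).
            conf.set("dfs.encryption.key.provider.uri",
                     "kms://https@kms.example.com:9600/kms");

            // Authenticate against the enterprise KDC from a keytab.
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab(
                "analyst@CORP.EXAMPLE.COM",
                "/etc/security/keytabs/analyst.keytab");
        }
    }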

BlueData has developed significant expertise in configuring, managing, and optimizing access to TDE-protected HDFS. Thomas Phelan offers a detailed description of how Transparent Data Encryption works with HDFS, with a particular focus on containerized environments. You’ll learn how HDFS TDE is configured and maintained in an environment where many big data frameworks run simultaneously (e.g., in a hybrid cloud architecture using Docker containers). You’ll also discover how to manage KDC credentials in a Kerberos cross-realm environment to provide data scientists and analysts with the greatest flexibility in accessing data while maintaining complete enterprise-grade data security.
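
One common pattern in such a cross-realm setup, sketched here with a hypothetical principal, realm, and cluster URI, is to obtain credentials for the realm that secures the storage cluster and run all HDFS access under that identity:

    import java.net.URI;
    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class CrossRealmAccess {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Log in to the realm that protects the remote HDFS cluster;
            // cross-realm trust must already be set up in krb5.conf.
            UserGroupInformation ugi =
                UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "etl@DATA.EXAMPLE.COM",
                    "/etc/security/keytabs/etl.keytab");

            // Every HDFS call inside doAs() runs as that principal.
            ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
                FileSystem fs = FileSystem.get(
                    URI.create("hdfs://datalake.example.com:8020"), conf);
                System.out.println(fs.exists(new Path("/secure/zone")));
                fs.close();
                return null;
            });
        }
    }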


Thomas Phelan

HPE BlueData

Thomas Phelan is cofounder and chief architect of BlueData. Previously, he was a member of the original team at Silicon Graphics that designed and implemented XFS, the first commercially available 64-bit file system. He was also an early employee at VMware, where, as a senior staff engineer and a key member of the ESX storage architecture team, he designed and developed the ESX storage I/O load-balancing subsystem and the modular pluggable storage architecture, and led teams working on key storage initiatives such as the cloud storage gateway and vFlash.