HDP Operations Migrating to the Hortonworks Data Platform

Course Overview

This course is designed for administrators who are familiar with administering other Hadoop distributions and are migrating to the Hortonworks Data Platform (HDP). It covers installation, configuration, maintenance, security and performance topics.

Target Audience

Experienced Hadoop administrators and operators responsible for installing, configuring and supporting the Hortonworks Data Platform.

Prerequisites

Attendees should be familiar with Hadoop fundamentals, have experience administering a Hadoop cluster, and have experience installing and configuring Hadoop components such as Sqoop, Flume, Hive, Pig, and Oozie.

Course Objectives

After completing this course, students will be able to:

  • Install and configure an HDP 2.x cluster
  • Use Ambari to monitor and manage a cluster
  • Mount HDFS to a local filesystem using the NFS Gateway
  • Configure Hive for Tez
  • Use Ambari to configure the schedulers of the ResourceManager
  • Commission and decommission worker nodes using Ambari
  • Use Falcon to define and process data pipelines
  • Take snapshots using the HDFS snapshot feature
  • Implement and configure NameNode HA using Ambari
  • Secure an HDP cluster using Ambari
  • Set up a Knox gateway
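As a taste of the NFS Gateway objective above, mounting HDFS on a Linux host typically looks like the following sketch. The hostname and mount point are placeholders, and the commands assume an NFS Gateway service is already running on the cluster:

```shell
# Mount HDFS exported by the HDP NFS Gateway.
# "nfsgateway.example.com" is a placeholder for the gateway host.
sudo mkdir -p /mnt/hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock,sync nfsgateway.example.com:/ /mnt/hdfs

# HDFS can now be browsed like a local file system.
ls /mnt/hdfs
```

The gateway only supports NFSv3 over TCP, which is why the `vers=3,proto=tcp` options are required.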

Format
50% Lecture/Discussion
50% Hands-on Labs

Course Outline

Hands-On Labs

  • Install HDP 2.x using Ambari
  • Add a new node to the cluster
  • Stop and start HDP services
  • Mount HDFS to a local file system
  • Configure the capacity scheduler
  • Use WebHDFS
  • Mirror datasets using Falcon
  • Commission and decommission a worker node using Ambari
  • Use HDFS snapshots
  • Configure NameNode HA using Ambari
  • Secure an HDP cluster using Ambari
  • Set up a Knox gateway
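For orientation, the WebHDFS and HDFS snapshot labs exercise commands along these lines. The hostname and paths are placeholders; port 50070 is the default NameNode HTTP port in HDP 2.x:

```shell
# List a directory over the WebHDFS REST API on the NameNode
# ("namenode.example.com" and the path are placeholders).
curl -i "http://namenode.example.com:50070/webhdfs/v1/user/hdfs?op=LISTSTATUS"

# Enable snapshots on a directory (requires HDFS superuser privileges),
# then take a snapshot and list its contents.
hdfs dfsadmin -allowSnapshot /data/important
hdfs dfs -createSnapshot /data/important snap1
hdfs dfs -ls /data/important/.snapshot/snap1
```

Snapshots are read-only, point-in-time copies stored under the hidden `.snapshot` directory; they protect against accidental deletes without duplicating block data.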
