HDP OPERATIONS: MIGRATING TO THE HORTONWORKS DATA PLATFORM – GTHDP08

Course Description

This course is designed for administrators who are familiar with administering other Hadoop distributions and are migrating to the Hortonworks Data Platform (HDP). It covers installation, configuration, maintenance, security and performance topics.

 

Course Objectives

  • Install and configure an HDP 2.x cluster
  • Use Ambari to monitor and manage a cluster
  • Mount HDFS to a local filesystem using the NFS Gateway (see the brief example after this list)
  • Configure Hive for Tez
  • Use Ambari to configure the ResourceManager schedulers
  • Commission and decommission worker nodes using Ambari
  • Use Falcon to define and process data pipelines
  • Take snapshots using the HDFS snapshot feature
  • Implement and configure NameNode HA using Ambari
  • Secure an HDP cluster using Ambari
  • Set up a Knox gateway
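
As a quick illustration of the NFS Gateway objective, the following is a minimal sketch of mounting HDFS on a Linux client; the gateway hostname (nfsgw.example.com) and mount point (/hdfs) are placeholders, and the mount options can vary by HDP release:

  # Create a local mount point (placeholder path)
  mkdir -p /hdfs
  # Mount the HDFS namespace exported by the NFS Gateway (NFSv3 over TCP, locking disabled)
  mount -t nfs -o vers=3,proto=tcp,nolock nfsgw.example.com:/ /hdfs
  # Browse HDFS as if it were a local filesystem
  ls /hdfs/user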

 

Format

50% Lecture/Discussion
50% Hands-on Labs

 

Certification

Hortonworks offers a comprehensive certification program that identifies you as an expert in Apache Hadoop. Visit hortonworks.com/training/certification for more information.

 

Hortonworks University

Hortonworks University is your expert source for Apache Hadoop training and certification. Public and private on-site courses are available for developers, administrators, data analysts and other IT professionals involved in implementing big data solutions. Classes combine presentation material with industry-leading hands-on labs that fully prepare students for real-world Hadoop scenarios.


Duration

2 days


Target Audience

Experienced Hadoop administrators and operators responsible for installing, configuring and supporting the Hortonworks Data Platform.


Course Prerequisites

Attendees should be familiar with Hadoop fundamentals, have experience administering a Hadoop cluster, and have experience installing and configuring Hadoop components such as Sqoop, Flume, Hive, Pig and Oozie.


Suggested Follow-on Courses

There are various courses you could take depending on your business needs. Get in touch with us – we would be more than happy to discuss your training objectives with you.


Course Content

Hands-On Labs

  • Install HDP 2.x using Ambari
  • Add a new node to the cluster
  • Stop and start HDP services
  • Mount HDFS to a local file system
  • Configure the capacity scheduler
  • Use WebHDFS
  • Mirror datasets using Falcon
  • Commission and decommission a worker node using Ambari
  • Use HDFS snapshots (see the brief example after this list)
  • Configure NameNode HA using Ambari
  • Secure an HDP cluster using Ambari
  • Set up a Knox gateway
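
As a quick illustration of the snapshot lab, here is a minimal sketch of the HDFS snapshot commands; the directory /data/sales and the snapshot name are placeholders:

  # Allow snapshots on a directory (run with HDFS administrator privileges)
  hdfs dfsadmin -allowSnapshot /data/sales
  # Create a named, read-only, point-in-time snapshot
  hdfs dfs -createSnapshot /data/sales before-upgrade
  # Snapshots appear under the hidden .snapshot directory
  hdfs dfs -ls /data/sales/.snapshot
  # Remove the snapshot once it is no longer needed
  hdfs dfs -deleteSnapshot /data/sales before-upgrade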

