Big Data Boot Camp
This interactive big data course provides a technical overview of Apache Hadoop for project managers, business managers, and data analysts. Students will gain an understanding of the overall big data space and the technologies involved, along with a detailed look at Apache Hadoop. The course uses real-world use cases to illustrate the capabilities of Apache Hadoop. Students will also learn about YARN and HDFS, and how to develop applications and analyze Big Data stored in Apache Hadoop using Apache Pig and Apache Hive. Each topic includes hands-on exercises.
The big data course is developed and taught by certified Hadoop consultants who have a passion for teaching and who deliver value to clients with Big Data and Hadoop technologies on a daily basis.
Duration
2 days / 16 hours of instruction
Education Credits
14 PDUs
Public Classroom Pricing
$1595 (USD)
GSA Price: $1485
Group Rate: $1495
Private Group Pricing
Have a group of 5 or more students? Request special pricing for private group training today.
Part 1: Introduction to Big Data
- Big Data – beyond the obvious trends
- Technologies involved
- Business drivers
- Implications for enterprise computing
- Exponentially increasing data
- ERP Data
- CRM Data
- Web Data
- Big Data
- Big data sources
- Sensors
- Social
- Geospatial
- Video
- Machine to machine
- Others
- Data warehousing, business intelligence, analytics, predictive statistics, data science
Part 2: Survey of Big Data technologies
- First generation systems
- RDBMS systems
- ETL systems
- BI systems
- Second generation systems
- Columnar databases with compression
- MPP architectures
- Data warehousing appliances
- Enterprise search
- Visualizing and understanding data with processing
- Stream processing
- Statistical processing
- Data visualization
- NoSQL databases
- How do technologies like MongoDB, MarkLogic, and CouchDB fit in?
- What is polyglot persistence?
- Apache Hadoop
Part 3: Introduction to Hadoop
- What is Hadoop? Who are the major vendors?
- A dive into the Hadoop Ecosystem
- Benefits of using Hadoop
- How do you use Hadoop within your infrastructure?
- Where do we use Hadoop?
- Where do we look at options besides Hadoop?
Part 4: Introduction to MapReduce
- What is MapReduce?
- Why do you need MapReduce?
- Using MapReduce with Java and Ruby
Lab: How to use MapReduce in Hadoop?
Part 5: Introduction to YARN
- What is YARN?
- What are the advantages of using YARN over classical MapReduce?
- Using YARN with Java and Ruby
Lab: How to use YARN within Hadoop?
Part 6: Introduction to HDFS
- What is HDFS?
- Why do you need a distributed file system?
- How is a distributed file system different from a traditional file system?
- What is unique about HDFS when compared to other file systems?
- How does HDFS provide reliability?
- Does HDFS support compression, checksums, and data integrity?
Lab: Overview of HDFS commands
Part 7: Data Transformation
- Why do you need to transform data?
- What is Pig?
- Use cases for Pig
Lab: Hands-on activities with Pig
Part 8: Structured Data Analysis
- How do you handle structured data with Hadoop?
- What is Hive/HCatalog?
- Use cases for Hive/HCatalog
Lab: Hands-on activities with Hive/HCatalog
Part 9: Loading data into Hadoop
- How do you move your existing data into Hadoop?
- What is Sqoop?
Lab: Hands-on activities with Sqoop
Part 10: Automating workflows in Hadoop
- Benefits of Automation
- What is Oozie?
- Automatically running workflows
- Setting up workflow triggers
Lab: Demonstration of Oozie
Part 11: Exploring opportunities in your own organization
- Framing scenarios
- Understanding how to ask questions
- Tying possibilities to your own business drivers
- Common opportunities
- Real world examples
Hands-on Exercises
You'll experience "in-the-trenches" practice built around actual big data implementations. You'll learn to avoid pitfalls and do it right the first time. Your instructor will help you map the tools and techniques you learn in this class to your own business, so you can apply them in your organization immediately after the class.
How to use MapReduce in Hadoop?
- How does it work from languages like Java?
- How does it work with languages like Ruby?
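For the Java side of this exercise, the sketch below shows the classic word-count job written against Hadoop's `org.apache.hadoop.mapreduce` API. It is an illustrative example rather than the exact lab code; the input and output paths are placeholders passed on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);   // emit (word, 1) for every token in the line
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();            // add up all the counts for this word
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory (must not exist yet)
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

A job like this is typically packaged into a jar and submitted with the `hadoop jar` command; the same code runs unchanged on YARN, which leads into the next exercise.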
How to use YARN within Hadoop?
- How does it work from languages like Java?
- How does it work with languages like Ruby?
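To give a feel for how YARN is exposed to Java code, here is a minimal sketch that uses the `YarnClient` API to list applications known to the ResourceManager. It assumes the cluster addresses come from the Hadoop configuration files on the classpath and is meant as orientation, not as the lab's exact code.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ListYarnApps {
    public static void main(String[] args) throws Exception {
        // Reads yarn-site.xml / core-site.xml from the classpath to find the ResourceManager.
        Configuration conf = new YarnConfiguration();

        YarnClient yarnClient = YarnClient.createYarnClient();
        yarnClient.init(conf);
        yarnClient.start();

        // Print one line per application the ResourceManager knows about.
        for (ApplicationReport report : yarnClient.getApplications()) {
            System.out.println(report.getApplicationId() + "  "
                    + report.getName() + "  "
                    + report.getYarnApplicationState());
        }

        yarnClient.stop();
    }
}
```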
Overview of HDFS commands
- Standard file system commands
- Moving data to and from HDFS
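Alongside the `hdfs dfs` shell commands covered in this lab, the same operations are available from Java through Hadoop's `FileSystem` API. The sketch below, with made-up file paths, mirrors a put, an ls, and a get.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRoundTrip {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath to locate the NameNode.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Copy a local file into HDFS (roughly `hdfs dfs -put`).
        fs.copyFromLocalFile(new Path("/tmp/sales.csv"), new Path("/data/sales.csv"));

        // List the target directory (roughly `hdfs dfs -ls /data`).
        for (FileStatus status : fs.listStatus(new Path("/data"))) {
            System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
        }

        // Copy the file back out of HDFS (roughly `hdfs dfs -get`).
        fs.copyToLocalFile(new Path("/data/sales.csv"), new Path("/tmp/sales-copy.csv"));

        fs.close();
    }
}
```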
Hands-on activities with Pig
- Joining Data
- Filtering Data
- Storing and Loading Data
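For orientation, here is a rough sketch of the kind of load/filter/join/store pipeline this lab works through, driven from Java via Pig's `PigServer` API. The file paths, schemas, and field names are invented for illustration; the lab itself may run the same Pig Latin from the grunt shell instead.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigJoinFilterSketch {
    public static void main(String[] args) throws Exception {
        // Local mode keeps the sketch self-contained; on a cluster you would use ExecType.MAPREDUCE.
        PigServer pig = new PigServer(ExecType.LOCAL);

        // Load two hypothetical tab-delimited files, filter one, join them, and store the result.
        pig.registerQuery("orders = LOAD '/tmp/orders.tsv' AS (order_id:int, cust_id:int, total:double);");
        pig.registerQuery("customers = LOAD '/tmp/customers.tsv' AS (cust_id:int, name:chararray);");
        pig.registerQuery("big_orders = FILTER orders BY total > 100.0;");
        pig.registerQuery("joined = JOIN big_orders BY cust_id, customers BY cust_id;");
        pig.store("joined", "/tmp/big_orders_by_customer");

        pig.shutdown();
    }
}
```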
Hands-on activities with Hive/HCatalog
- Storing and Loading Data
- Select expressions
- Hive vs SQL
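As a companion to this lab, the sketch below shows one common way to run HiveQL from Java: through the HiveServer2 JDBC driver. The connection URL, table name, and HDFS path are assumptions; the lab may use the Hive CLI or Beeline for the same statements.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveSelectSketch {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver and URL; host, port, and database are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://localhost:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "", "");
             Statement stmt = conn.createStatement()) {

            // Create a delimited table and load data that already sits in HDFS.
            stmt.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, cust_id INT, total DOUBLE) "
                    + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'");
            stmt.execute("LOAD DATA INPATH '/data/orders.tsv' INTO TABLE orders");

            // An ordinary SELECT with aggregation, just as you would write it in SQL.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT cust_id, SUM(total) AS spend FROM orders GROUP BY cust_id")) {
                while (rs.next()) {
                    System.out.println(rs.getInt("cust_id") + "\t" + rs.getDouble("spend"));
                }
            }
        }
    }
}
```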
Hands-on activities with Sqoop
- Running evaluation commands with Sqoop
- Importing data from relational databases
- Exporting data to relational databases
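Sqoop is normally driven from the command line, so the sketch below simply wraps a typical `sqoop import` invocation in a Java `ProcessBuilder`. The JDBC URL, credentials file, table name, and target directory are all placeholders for illustration.

```java
import java.io.IOException;

public class SqoopImportSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Pull a hypothetical `orders` table from MySQL into HDFS with four parallel mappers.
        ProcessBuilder pb = new ProcessBuilder(
                "sqoop", "import",
                "--connect", "jdbc:mysql://dbhost:3306/sales",
                "--username", "etl_user",
                "--password-file", "/user/etl/.sqoop-password",
                "--table", "orders",
                "--target-dir", "/data/orders",
                "--num-mappers", "4");

        pb.inheritIO();                       // stream Sqoop's console output to this process
        int exitCode = pb.start().waitFor();
        System.out.println("sqoop import finished with exit code " + exitCode);
    }
}
```

An export back to the relational database follows the same pattern with `sqoop export`, an `--export-dir` pointing at the HDFS data, and the target table name.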
Demonstration of Oozie
- Creating a workflow
- Running a workflow automatically at regular intervals
- Running a workflow automatically when some events are triggered
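To illustrate the kind of automation this demonstration covers, here is a hedged sketch that submits a workflow with the Oozie Java client (`OozieClient`). The Oozie URL, HDFS application path, cluster endpoints, and property names are assumptions tied to a hypothetical workflow.xml; time-based and event-triggered runs would be layered on top with an Oozie coordinator definition.

```java
import java.util.Properties;

import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.WorkflowJob;

public class OozieSubmitSketch {
    public static void main(String[] args) throws Exception {
        // URL of the Oozie server; host and port are placeholders.
        OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");

        // Job properties: where the workflow app lives in HDFS, plus the
        // parameters the hypothetical workflow.xml expects (${nameNode}, ${jobTracker}).
        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH, "hdfs://namenode:8020/user/demo/apps/etl-workflow");
        conf.setProperty("nameNode", "hdfs://namenode:8020");
        conf.setProperty("jobTracker", "resourcemanager:8032");

        // Submit and start the workflow, then check on it.
        String jobId = oozie.run(conf);
        System.out.println("Submitted workflow " + jobId);

        Thread.sleep(10_000);
        WorkflowJob job = oozie.getJobInfo(jobId);
        System.out.println("Workflow status: " + job.getStatus());
    }
}
```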
Anybody who is involved with databases or data analysis, or who is wondering how to deal with mountains of data (anywhere from gigabytes of user/log data to petabytes), will benefit from this program.
This Big Data Course is Perfect For:
- Business Analysts
- Software Engineers
- Project Managers
- Data Analysts
- Business Customers
- Team Leaders
- System Analysts
In This Big Data Course, You Will:
- Learn about the big data ecosystem
- Understand the benefits and ROI you can get from your existing data
- Learn about Hadoop and how it is transforming the workspace
- Learn about MapReduce and Hadoop Distributed File system
- Learn how to use Hadoop to identify new business opportunities
- Learn about using Hadoop to improve data management processes
- Learn about using Hadoop to clarify results
- Learn about using Hadoop to expand your data sources
- Learn how to scale your current workflows to handle more users while lowering your overall costs
- Learn about the various technologies that comprise the Hadoop ecosystem