Taming Big Data with Apache Spark and Python - Hands On!

4.5 (8980)
MOOC
Payment
Paid
Language
English
Duration
7-hour course
Course by Udemy
What will you learn?
Use DataFrames and Structured Streaming in Spark 3
Frame big data analysis problems as Spark problems
Use Amazon's Elastic MapReduce service to run your job on a cluster with Hadoop YARN
Install and run Apache Spark on a desktop computer or on a cluster
Use Spark's Resilient Distributed Datasets to process and analyze large data sets across many CPUs
Implement iterative algorithms such as breadth-first-search using Spark
Use the MLlib machine learning library to answer common data mining questions
Understand how Spark SQL lets you work with structured data
Understand how Spark Streaming lets you process continuous streams of data in real time
Tune and troubleshoot large jobs running on a cluster
Share information between nodes on a Spark cluster using broadcast variables and accumulators
Understand how the GraphX library helps with network analysis problems
About the course

New! Updated for Spark 3, more hands-on exercises, and a stronger focus on DataFrames and Structured Streaming.

“Big data" analysis is a hot and highly valuable skill – and this course will teach you the hottest technology in big data: Apache Spark. Employers including Amazon, EBay, NASA JPL, and Yahoo all use Spark to quickly extract meaning from massive data sets across a fault-tolerant Hadoop cluster. You'll learn those same techniques, using your own Windows system right at home. It's easier than you might think.

In this course, you'll learn and master the art of framing data analysis problems as Spark problems through over 20 hands-on examples, then scale them up to run on cloud computing services. You'll be learning from an ex-engineer and senior manager from Amazon and IMDb.

  • Learn the concepts of Spark's DataFrames and Resilient Distributed Datasets
  • Develop and run Spark jobs quickly using Python
  • Translate complex analysis problems into iterative or multi-stage Spark scripts
  • Scale up to larger data sets using Amazon's Elastic MapReduce service
  • Understand how Hadoop YARN distributes Spark across computing clusters
  • Learn about other Spark technologies, like Spark SQL, Spark Streaming, and GraphX

By the end of this course, you'll be running code that analyzes gigabytes worth of information – in the cloud – in a matter of minutes. 

This course uses the familiar Python programming language; if you'd rather use Scala to get the best performance out of Spark, see my "Apache Spark with Scala - Hands On with Big Data" course instead.

We'll have some fun along the way. You'll get warmed up with some simple examples of using Spark to analyze movie ratings data and text in a book. Once you've got the basics under your belt, we'll move to some more complex and interesting tasks. We'll use a million movie ratings to find movies that are similar to each other, and you might even discover some new movies you like in the process! We'll analyze a social graph of superheroes and learn who the most "popular" superhero is – and develop a system to find the "degrees of separation" between superheroes. Are all Marvel superheroes within a few degrees of being connected to The Incredible Hulk? You'll find the answer.

This course is very hands-on; you'll spend most of your time following along with the instructor as we write, analyze, and run real code together – both on your own system, and in the cloud using Amazon's Elastic MapReduce service. 7 hours of video content is included, with over 20 real examples of increasing complexity you can build, run and study yourself. Move through them at your own pace, on your own schedule. The course wraps up with an overview of other Spark-based technologies, including Spark SQL, Spark Streaming, and GraphX.

Wrangling big data with Apache Spark is an important skill in today's technical world. Enroll now!

  • " I studied "Taming Big Data with Apache Spark and Python" with Frank Kane, and helped me build a great platform for Big Data as a Service for my company. I recommend the course!  " - Cleuton Sampaio De Melo Jr.
Program
Getting Started with Spark
Set up a working development environment for Spark with Python on your desktop.
Introduction

Meet your instructor, and we'll review what this course will cover and what you need to get started.

How to Use This Course
How to find the scripts and data associated with the lectures in this course.
Udemy 101: Getting the Most From This Course
Warning about Java 11 and Spark 2.4.0!
While setting things up, do NOT install Java 9, 10, or 11 - it's not compatible with Spark yet. Scroll down on the JDK download page, and install a JDK for Java 8 instead.
[Activity] Getting Set Up: Installing Python, a JDK, Spark, and its Dependencies
We'll install Enthought Canopy, a JDK, and Apache Spark on your Windows system. When we're done, we'll run a simple little Spark script on your desktop to test it out!
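
To give a flavor of what such a test script can look like, here is a minimal sketch (this is not necessarily the course's own script; the app name and sample values are made up for illustration):

    # Minimal smoke test for a local Spark installation (illustrative only).
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster("local").setAppName("SparkSmokeTest")
    sc = SparkContext(conf=conf)

    # Distribute a small list of numbers and run a trivial computation on it.
    squares = sc.parallelize(range(10)).map(lambda x: x * x).collect()
    print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

    sc.stop()

If the list of squares prints without errors, Spark and its Python bindings are working together correctly.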
[Activity] Installing the MovieLens Movie Rating Dataset
Before we can analyze data with Spark, we need some data to analyze! Let's install the MovieLens dataset of movie ratings, which we'll use throughout the course.
[Activity] Run your first Spark program! Ratings histogram example.

We'll run a simple Spark script using Python, and analyze the 100,000 movie ratings you installed in the previous lecture. What is the breakdown of the rating scores in this data set? You'll find it's easy to find out!
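As a rough sketch of the idea (the file path and field layout assume the MovieLens 100k "u.data" file, which is tab-separated as userID, movieID, rating, timestamp; this is not necessarily the exact script used in the lecture):

    # Sketch: count how many times each rating value appears in the data set.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
    sc = SparkContext(conf=conf)

    lines = sc.textFile("ml-100k/u.data")              # path is illustrative
    ratings = lines.map(lambda line: line.split()[2])  # rating is the third field
    result = ratings.countByValue()                    # action: rating -> count

    for rating, count in sorted(result.items()):
        print(f"{rating}: {count}")

    sc.stop()
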

Spark Basics and Simple Examples
Understand Spark's Resilient Distributed Dataset objects and see some examples of using them.
Introduction to Spark
This high-level introduction will help you understand what Spark is for, who's using it, and why it's such a big deal.
The Resilient Distributed Dataset (RDD)
Understand the core object of Spark: the Resilient Distributed Dataset (RDD), and how you can use Spark to transform and perform actions upon RDDs.
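As a tiny illustration of the transformation/action distinction (the data here is made up, and this is not code from the lecture):

    # Transformations (filter, map) are lazy: they only describe a new RDD.
    # Actions (collect, reduce) trigger the actual computation.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster("local").setAppName("RDDBasics")
    sc = SparkContext(conf=conf)

    nums = sc.parallelize([1, 2, 3, 4, 5])

    evens_squared = nums.filter(lambda x: x % 2 == 0).map(lambda x: x * x)

    print(evens_squared.collect())           # [4, 16]
    print(nums.reduce(lambda a, b: a + b))   # 15

    sc.stop()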
Ratings Histogram Walkthrough

We'll dissect our original ratings histogram Spark example, and understand exactly how every line of it works!

Requirements
  • Access to a personal computer. This course uses Windows, but the sample code will work fine on Linux as well.
  • Some prior programming or scripting experience. Python experience will help a lot, but you can pick it up as we go.
Lecturers
Sundog Education by Frank Kane
Founder, Sundog Education. Machine Learning Pro
Frank Kane
Founder, Sundog Education
Platform
Udemy courses are suited to professional development. The platform is organized in such a way that the experts themselves decide the topic and when the course will start. All supporting documents are made available to you with lifetime access. On this platform, you can find a course on just about any subject, and that is no exaggeration – from a tutorial on how to ride a motorcycle to managing the financial markets. The language and the course format are established by the teacher. This is why it is important to read the information about the course carefully before parting with any money.
Rating
4.5