Flume and Sqoop for Ingesting Big Data
Taught by a team that includes two Stanford-educated ex-Googlers, with decades of combined practical experience working with Java and with billions of rows of data.
Use Flume and Sqoop to import data to HDFS, HBase and Hive from a variety of sources, including Twitter and MySQL
Let’s parse that.
Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP endpoints, MySQL and Twitter, to data stores like HDFS, HBase and Hive. Both tools come with built-in functionality that shields users from the complexity of moving data between these systems.
Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase.
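As a taste of what that looks like, a minimal Flume agent can be described entirely in a properties file. The sketch below (the agent name a1, the directory and the HDFS path are illustrative placeholders, not values from the course) watches a spooling directory and writes whatever lands there into HDFS:

    # flume.conf: one agent with one source, one channel, one sink
    a1.sources = src1
    a1.channels = ch1
    a1.sinks = snk1

    # Source: watch a local directory for newly arriving files
    a1.sources.src1.type = spooldir
    a1.sources.src1.spoolDir = /var/log/incoming
    a1.sources.src1.channels = ch1

    # Channel: buffer events in memory between source and sink
    a1.channels.ch1.type = memory
    a1.channels.ch1.capacity = 10000

    # Sink: write events into HDFS as plain text
    a1.sinks.snk1.type = hdfs
    a1.sinks.snk1.hdfs.path = hdfs://namenode:8020/flume/events
    a1.sinks.snk1.hdfs.fileType = DataStream
    a1.sinks.snk1.channel = ch1

The agent is then launched with a single command: flume-ng agent --conf conf --conf-file flume.conf --name a1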
Sqoop: Use Sqoop to bulk import data from a traditional RDBMS into Hadoop storage architectures like HDFS or Hive.
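In practice a bulk import is a single command. This sketch (the connection string, table and directory names are invented for illustration) moves an entire MySQL table into HDFS using parallel map tasks:

    # Bulk-import the "customers" table into HDFS with 4 parallel mappers
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/retail \
      --username sqoop_user -P \
      --table customers \
      --target-dir /user/hadoop/customers \
      --num-mappers 4

Swapping in --hive-import tells Sqoop to load the data into a Hive table instead of a bare HDFS directory.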
What's Covered:
Practical implementations for a variety of sources and data stores:
- Sources: Twitter, MySQL, Spooling Directory, HTTP
- Sinks: HDFS, HBase, Hive
Flume features (see the sketch below):
- Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
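To make those features concrete, here is a sketch of how they fit together; the property names are real Flume settings, while the agent, channel and header names are illustrative. A timestamp interceptor stamps each event, escape sequences in the sink path bucket events by date, and a multiplexing channel selector routes events by header value:

    # Interceptor: add a timestamp header to each event at the source
    a1.sources.src1.interceptors = ts
    a1.sources.src1.interceptors.ts.type = timestamp

    # Event bucketing: the HDFS sink expands %Y/%m/%d from that header,
    # so events land in one directory per day
    a1.sinks.snk1.hdfs.path = hdfs://namenode:8020/events/%Y/%m/%d

    # Channel selector: route events to different channels by header value
    a1.sources.src1.selector.type = multiplexing
    a1.sources.src1.selector.header = category
    a1.sources.src1.selector.mapping.alerts = ch1
    a1.sources.src1.selector.default = ch2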
Sqoop features:
- Sqoop import from MySQL, Incremental imports using Sqoop Jobs
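The incremental pattern works like this (database, table and column names below are made up for illustration): a saved Sqoop job records the highest value it has seen in a check column, so re-running the job picks up only new rows:

    # Create a saved job; Sqoop stores the last-value between runs
    sqoop job --create orders_incremental -- import \
      --connect jdbc:mysql://dbhost:3306/retail \
      --username sqoop_user -P \
      --table orders \
      --target-dir /user/hadoop/orders \
      --incremental append \
      --check-column order_id \
      --last-value 0

    # Each execution imports only rows where order_id exceeds the stored value
    sqoop job --exec orders_incremental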
What you will learn
- Use Flume to ingest data to HDFS and HBase
- Use Sqoop to import data from MySQL to HDFS and Hive
- Ingest data from a variety of sources including HTTP, Twitter and MySQL
Rating: 3.5
Level: All Levels
Duration: 2.5 hours
Instructor: Loony Corn