Life-cycle: the internal workings of HDFS, Sqoop, Hive, Spark, HBase, and Kafka, with code.
Updated Sep 10, 2019 · Shell
Hadoop 3.2 in single-node or cluster mode, with the gotty web terminal, Spark, Jupyter PySpark, Hive, and other ecosystem tools.
Instructions for setting up Hadoop, HDFS, Java, sbt, Kafka, Scala, Spark, and Flume on Ubuntu 18.04.
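A setup guide like the one above usually boils down to installing a JDK, unpacking a Hadoop release, and exporting a few environment variables. The sketch below shows that shape; the Hadoop version, download URL, and JVM path are assumptions for illustration, not taken from the repository itself.

```shell
# Typical install steps on Ubuntu 18.04 (shown as comments; the version
# and mirror URL are assumptions):
#
#   sudo apt-get install -y openjdk-8-jdk
#   wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
#   tar -xzf hadoop-3.2.0.tar.gz -C "$HOME"

# Point the shell at the JVM and the unpacked Hadoop tree
# (the JAVA_HOME path is the usual Ubuntu location for OpenJDK 8).
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME="$HOME/hadoop-3.2.0"
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
```

With these variables in place, commands such as `hadoop version` and `hdfs namenode -format` become available on the PATH; the remaining services (Kafka, Spark, Flume) follow the same unpack-and-export pattern.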
A single-node EMR 5.25.0 cluster Hadoop Docker image, based on Amazon Linux with Hadoop 2.8.5 and Hive 2.3.5.
An environment for practicing the Ansible and Hadoop tools on a single instance.