diff --git a/hadoop-unit-site/src/site/markdown/cli.md b/hadoop-unit-site/src/site/markdown/cli.md
index 4139ce80..038696eb 100644
--- a/hadoop-unit-site/src/site/markdown/cli.md
+++ b/hadoop-unit-site/src/site/markdown/cli.md
@@ -19,7 +19,7 @@ Hadoop-unit can be used with common tools such as:
 ```bash
 hdfs dfs -ls hdfs://localhost:20112/
 ```
 
-**For windows user, you could have some issue like `-classpath is not known`. The cause of these errors are because your `JAVA_HOME` has space into. If your `JAVA_HOME` is linked to `C:\Program File\Java\...` then declared it as `C:\Progra~1\Java\...`
+**For Windows users, you may see errors such as `-classpath is not known`. They occur because your `JAVA_HOME` contains a space. If your `JAVA_HOME` points to `C:\Program Files\Java\...`, declare it as `C:\Progra~1\Java\...` instead.**
 
 # Kafka-console command
@@ -30,12 +30,14 @@ hdfs dfs -ls hdfs://localhost:20112/
 ```bash
 kafka-console-consumer --zookeeper localhost:22010 --topic topic
 ```
+
 # HBase Shell
 
 * Download and unzip HBase
 * set variable `HBASE_HOME`
 * edit file `HBASE_HOME/conf/hbase-site.xml`:
+
 ```xml
@@ -54,6 +56,7 @@ kafka-console-consumer --zookeeper localhost:22010 --topic topic
 ```bash
 hbase shell
 ```
+
 
 # Hive Shell
diff --git a/hadoop-unit-site/src/site/markdown/focus.md b/hadoop-unit-site/src/site/markdown/focus.md
index d3fbff19..5b1bed65 100644
--- a/hadoop-unit-site/src/site/markdown/focus.md
+++ b/hadoop-unit-site/src/site/markdown/focus.md
@@ -8,6 +8,7 @@
 # Focus on ElasticSearch
 
 Under the hood, to run ElasticSearch, this is how it works:
+
 * ElasticSearch is downloaded
 * A Java Process is run to start ElasticSearch
 
@@ -19,6 +20,7 @@ To know which version of ElasticSearch has to be downloaded, the variable ```ela
 # Focus on Redis
 
 Under the hood, to run Redis, this is how it works:
+
 * Redis is downloaded
 * An ant task is run to execute the ```make``` command
 
@@ -32,5 +34,6 @@ To know which version of Redis has to be downloaded, the variable ```redis.versi
 # Focus on Oozie
 
 To use oozie, you need:
+
 * to download the [oozie's share libs](http://s3.amazonaws.com/public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.1.0/tars/oozie/oozie-4.2.0.2.6.1.0-129-distro.tar.gz)
 * to edit the configuration file ```conf/hadoop-unit-default.properties``` and to set the variable ```oozie.sharelib.path``` to where you downloaded the oozie's share libs
\ No newline at end of file
diff --git a/hadoop-unit-site/src/site/markdown/howto-build.md b/hadoop-unit-site/src/site/markdown/howto-build.md
index 418c3cbc..90112ef5 100644
--- a/hadoop-unit-site/src/site/markdown/howto-build.md
+++ b/hadoop-unit-site/src/site/markdown/howto-build.md
@@ -1,10 +1,12 @@
 # Hadoop Unit build
 
 To build Hadoop Unit, you need:
+
 * jdk 1.8
 * maven 3.0+
 
 Run:
+
 ```bash
 mvn install -DskipTests
 ```
diff --git a/hadoop-unit-site/src/site/markdown/index.md b/hadoop-unit-site/src/site/markdown/index.md
index e7c3681c..e2f56c34 100644
--- a/hadoop-unit-site/src/site/markdown/index.md
+++ b/hadoop-unit-site/src/site/markdown/index.md
@@ -45,6 +45,7 @@ Welcome to the Hadoop Unit wiki!
 * [Why Hadoop Unit](why-hadoop-unit.html)
 * Installation
+
 * [Install the standalone Hadoop Unit mode](install-hadoop-unit-standalone.html)
 * [Integrate Hadoop Unit in your maven project](maven-usage.html)
diff --git a/hadoop-unit-site/src/site/markdown/why-hadoop-unit.md b/hadoop-unit-site/src/site/markdown/why-hadoop-unit.md
index ce662d48..f51f229b 100644
--- a/hadoop-unit-site/src/site/markdown/why-hadoop-unit.md
+++ b/hadoop-unit-site/src/site/markdown/why-hadoop-unit.md
@@ -14,9 +14,13 @@ Moreover, Hadoop Unit allow users to work on their laptop which bring a better f
 As Hadoop's components have not been designed to run in the same JVM, a lot of problems occurs.
 
-In fact, Hadoop Unit run each component in his own classloader to avoid classpath issues. To do this, it is using maven resolver which is the dependency engine of maven and hadoop mini cluster which allows to make integration tests for hortonworks much easier.
+In fact, Hadoop Unit runs each component in its own classloader to avoid classpath issues (in standalone mode, and in the embedded mode of the Maven integration plugin). To do this, it uses Maven Resolver, the dependency engine of Maven, together with hadoop mini cluster, which makes integration tests for Hortonworks much easier.
 
 In fact, hadoop mini cluster is using stuff like MiniDFS, LocalOozie which are available in the Hadoop world.
 
+For the [Simple dependency usage](maven-usage.html#simple-dependency-usage) mode, a *service provider interface* is used under the hood. That is why, if you need HBase for example, you do not need to add ZooKeeper to your test dependencies: Maven's dependency transitivity provides it.
+
+Moreover, Hadoop Unit manages the right bootstrap order of the components for you.
+
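Editor's note on the last added sentence: the idea of "managing the right bootstrap order" can be sketched as starting every component only after the components it depends on. The component names and dependency edges below are illustrative assumptions, not Hadoop Unit's actual metadata or API:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative sketch only: a depth-first walk that starts each
// component's dependencies before the component itself, as a
// dependency-ordered bootstrap would. The graph below is hypothetical.
public class BootstrapOrder {

    // component -> components it must wait for (assumed edges)
    static final Map<String, List<String>> DEPENDS_ON = Map.of(
            "zookeeper", List.of(),
            "hdfs", List.of(),
            "kafka", List.of("zookeeper"),
            "hbase", List.of("zookeeper", "hdfs"));

    static List<String> startOrder() {
        List<String> order = new ArrayList<>();
        Set<String> visited = new HashSet<>();
        for (String component : DEPENDS_ON.keySet()) {
            visit(component, visited, order);
        }
        return order;
    }

    // depth-first: emit dependencies before the component itself
    static void visit(String component, Set<String> visited, List<String> order) {
        if (!visited.add(component)) {
            return;
        }
        for (String dep : DEPENDS_ON.get(component)) {
            visit(dep, visited, order);
        }
        order.add(component);
    }

    public static void main(String[] args) {
        List<String> order = startOrder();
        System.out.println(order);
        // sanity check: every component comes after all of its dependencies
        for (String component : DEPENDS_ON.keySet()) {
            for (String dep : DEPENDS_ON.get(component)) {
                if (order.indexOf(dep) >= order.indexOf(component)) {
                    throw new IllegalStateException(dep + " must start before " + component);
                }
            }
        }
        System.out.println("bootstrap order ok");
    }
}
```

Whatever the traversal order of the map, ZooKeeper is always started before Kafka and HBase, which is the guarantee the documentation claims Hadoop Unit gives you.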