docker pull e4glet/hadoop:2.0 (Hadoop only)
docker pull e4glet/hadoop:2.1 (Hadoop and Spark)
tag: 2.0
OS: Ubuntu 16.04
JDK Version: OpenJDK 8
Hadoop Version: 2.7.2
tag: 2.1
OS: Ubuntu 16.04
JDK Version: OpenJDK 8
Hadoop Version: 2.7.2
Spark Version: 2.1.3
This image can also be used for development in Eclipse with JDK 1.8.0_191.
docker pull e4glet/hadoop:2.0
git clone https://github.com/e4glet/hadoop-cluster-docker2.0
docker network create --driver=bridge hadoop
cd hadoop-cluster-docker2.0
Add execute permission to the script file:
chmod 777 resize-cluster.sh
like this:
./resize-cluster.sh 5
./start-container.sh 5
result:
start hadoop-master container...
start hadoop-slave1 container...
start hadoop-slave2 container...
start hadoop-slave3 container...
start hadoop-slave4 container...
root@hadoop-master:~#
root@hadoop-master:~# ./start-hadoop.sh
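I have not inspected the contents of start-hadoop.sh; in a standard Hadoop 2.x layout it would most likely just wrap the stock daemon start scripts, roughly like this (a hedged guess, the actual script in the image may differ):

```shell
# Hedged guess at what start-hadoop.sh wraps: the stock Hadoop 2.x
# daemon start scripts (the actual script in the image may differ).
$HADOOP_HOME/sbin/start-dfs.sh    # starts NameNode + DataNodes
$HADOOP_HOME/sbin/start-yarn.sh   # starts ResourceManager + NodeManagers
```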
When you are done, remove the containers:
./remove-container.sh 5
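As a hedged sketch (I have not read the script), remove-container.sh for N nodes presumably removes the master plus N-1 slaves. The dry-run loop below shows that naming logic, with echo in front of the docker calls so nothing is actually removed:

```shell
# Hypothetical sketch of remove-container.sh's loop for N=5 nodes:
# build the container name list, then remove each one.
# echo makes this a dry run; drop it to actually call docker rm.
N=5
containers="hadoop-master"
for i in $(seq 1 $((N - 1))); do
  containers="$containers hadoop-slave$i"
done
for c in $containers; do
  echo docker rm -f "$c"
done
```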
How to configure Eclipse
You can define a Hadoop location like this in Eclipse 2018:
How to upload your test app into the container
You will need to mount a Docker volume at every location where hadoop-master writes data. By default, hadoop-master requires write access to $PWD/hadoop and /root/hadoop.
Look at your start-container.sh:
-v $PWD/hadoop:/root/hadoop
If you have a more advanced application that requires hadoop-master to write to other locations, simply add more volume mounts to those locations.
like this:
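A hedged sketch of such a docker run line (the $PWD/logs:/root/logs pair is a purely hypothetical extra mount, and the other flags are inferred from this README's network name, -v line, and container names, so the repo's actual script may differ; requires a running Docker daemon):

```shell
# Sketch only: one extra -v mount added alongside the existing one.
# $PWD/logs:/root/logs is a hypothetical example path pair.
docker run -itd \
        --net=hadoop \
        -v "$PWD/hadoop:/root/hadoop" \
        -v "$PWD/logs:/root/logs" \
        --name hadoop-master \
        --hostname hadoop-master \
        e4glet/hadoop:2.0
```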
Pretty cool, right?
New tag 2.1 includes Spark 2.1.3.
Startup is the same as for tag 2.0.
Spark Web UI: http://HostIP:8040/