# jdk+hadoop+hbase+spark Setup

**Repository Path**: lonsonlo/hadoopSparkHbaseEnv

## Basic Information

- **Project Name**: jdk+hadoop+hbase+spark setup
- **Description**: Configuration backup of a working jdk+hadoop+hbase+spark deployment, saved after the server lease expired
- **Primary Language**: Java
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 1
- **Created**: 2020-07-12
- **Last Updated**: 2020-12-18

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# jdk+hadoop+hbase+spark Setup

Configuration backup of a working jdk+hadoop+hbase+spark deployment, saved after the server lease expired.

`/etc/hosts` entries:

```
118.31.38.25   work1
101.37.147.47  work2
47.96.11.140   master
```

1. Set the hostname and `/etc/hosts` on every machine.
2. Configure passwordless SSH login between the nodes.

Complete environment variables:

```shell
export HADOOP_HOME=/root/hadoop-2.7.7
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

export HBASE_HOME=/root/hbase-1.4.11
export PATH=$PATH:$HBASE_HOME/bin

export SPARK_HOME=/root/spark-2.2.3-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin

JAVA_HOME=/root/jdk1.8.0_231
CLASSPATH=$JAVA_HOME/lib/
PATH=$PATH:$JAVA_HOME/bin
export PATH JAVA_HOME CLASSPATH
```

Extract the archives into `/root`; the bundled configuration files for each application can then be used largely as-is, provided the hostname and `/etc/hosts` are set up first.

Archive download:

- Link: https://pan.baidu.com/s/1bJAW1Ras7K7mnL1X5EeWjQ
- Extraction code: 2eos

Cluster web UIs:

- HDFS: http://master:50070/dfshealth.html#tab-overview
- YARN: http://master:8088/cluster/nodes
- HBase: http://master:16010/master-status
- Spark: http://master:8080/

Ports to open:

- 7077/7077
- 16030/16030
- 16000/16000
- 16020/16020
- 2181/2181
- 16010/16010
- 50070/50070
- 3000/10000
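The passwordless-SSH step above can be sketched as follows. This is a minimal sketch, not the exact commands used for this cluster: it assumes the nodes are named as in the hosts list, that you log in as `root` (the install paths under `/root` suggest this), and that `ssh-copy-id` is available on the distribution in use.

```shell
# Generate an RSA key pair once on the master node (empty passphrase,
# so Hadoop/HBase start scripts can log in non-interactively).
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Copy the public key to every node, including master itself,
# so start-dfs.sh / start-hbase.sh can reach all workers without a password.
for host in master work1 work2; do
    ssh-copy-id -i ~/.ssh/id_rsa.pub root@"$host"
done

# Verify: this should print the worker's hostname without prompting.
ssh root@work1 hostname
```

Run the verification against each node once; if any host still prompts for a password, check the permissions on `~/.ssh` (700) and `~/.ssh/authorized_keys` (600) on that node.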
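The port list above can be opened in one pass. This sketch assumes a CentOS-style host running `firewalld`; on machines using plain iptables, or on cloud servers where access is controlled by a security group, the equivalent rules go there instead.

```shell
# Open the Spark master (7077), HBase (16000/16010/16020/16030),
# ZooKeeper (2181) and HDFS NameNode web UI (50070) ports.
for port in 7077 16030 16000 16020 2181 16010 50070; do
    firewall-cmd --permanent --zone=public --add-port="${port}/tcp"
done

# The original list also contains the entry 3000/10000; it is left as-is
# here. If it denotes a range, it would be added as 3000-10000/tcp.

firewall-cmd --reload
```

Note that several of these are internal RPC ports; on a public cloud it is safer to open them only between the three cluster nodes and expose the web UIs selectively.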