My blog is a good helper for my work; whenever I run into trouble, I come here to look things up.
Category: System Operations
2017-01-24 16:41:02
I. Install the JDK
JDK 8 (jdk-8u112) download address:
After downloading, install the RPM:
rpm -ivh jdk-8u112-linux-x64.rpm
Set the JDK environment variables (append to /etc/profile or ~/.bash_profile):
export JAVA_HOME=/usr/java/jdk1.8.0_112
export CLASSPATH=$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
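A quick way to confirm the variables took effect after sourcing the profile; a minimal sketch assuming the jdk1.8.0_112 install path used above:

```shell
# Re-apply the profile settings (normally done via `source /etc/profile`)
export JAVA_HOME=/usr/java/jdk1.8.0_112
export PATH=$JAVA_HOME/bin:$PATH
# Because we prepended, the JDK's bin directory should now be first on PATH:
echo "$PATH" | cut -d: -f1
```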
II. Install Hadoop
1. DNS binding
Edit /etc/hosts (vi /etc/hosts) and add one line for the master node; here my master node's IP is 192.168.80.100:
192.168.80.100 IMM-SJJ01-Server18
2. Passwordless SSH login
cd /home/data/.ssh
ssh-keygen -t rsa
cat id_rsa.pub >> authorized_keys
chmod 600 authorized_keys   # sshd rejects the file if it is group/world-writable
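The effect of the two file operations can be sketched in a scratch directory (the key material below is a placeholder, not a real key; the real files live under ~/.ssh):

```shell
# Simulate installing a public key into authorized_keys
dir=$(mktemp -d)
echo "ssh-rsa AAAAB3placeholder user@IMM-SJJ01-Server18" > "$dir/id_rsa.pub"
cat "$dir/id_rsa.pub" >> "$dir/authorized_keys"
chmod 600 "$dir/authorized_keys"   # required: sshd ignores permissive files
# The key line from id_rsa.pub must appear verbatim in authorized_keys:
grep -qf "$dir/id_rsa.pub" "$dir/authorized_keys" && echo "key installed"
```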
3. Install Hadoop
# download hadoop-2.7.3.tar.gz from an Apache mirror
wget
cd /home/game/soft
tar zxvf hadoop-2.7.3.tar.gz
ln -s /home/game/soft/hadoop-2.7.3 /home/game/soft/hadoop
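The version-agnostic symlink is what makes later upgrades painless: every config points at .../hadoop, and an upgrade just repoints the link. A scratch-directory sketch (the 2.7.4 directory is hypothetical):

```shell
# Simulate the versioned-install + stable-symlink layout
base=$(mktemp -d)
mkdir "$base/hadoop-2.7.3" "$base/hadoop-2.7.4"
ln -s "$base/hadoop-2.7.3" "$base/hadoop"
readlink "$base/hadoop"                        # points at .../hadoop-2.7.3
ln -sfn "$base/hadoop-2.7.4" "$base/hadoop"    # "upgrade" = repoint the link
readlink "$base/hadoop"                        # now points at .../hadoop-2.7.4
```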
4. Configuration
1) Set the Hadoop environment variables
vim ~/.bash_profile   # or /etc/profile
export HADOOP_HOME=/home/game/soft/hadoop
export PATH=$HADOOP_HOME/bin:$PATH
echo $HADOOP_HOME
2) Edit hadoop-env.sh
vim $HADOOP_HOME/etc/hadoop/hadoop-env.sh
Change export JAVA_HOME=${JAVA_HOME} to:
export JAVA_HOME=/usr/java/jdk1.8.0_112
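Hard-coding JAVA_HOME here matters because Hadoop daemons are launched over ssh and do not inherit your interactive login environment. The edit can also be scripted; this sketch applies the same sed substitution to a scratch copy of the line (the real file is $HADOOP_HOME/etc/hadoop/hadoop-env.sh):

```shell
# Non-interactive equivalent of the hadoop-env.sh edit
f=$(mktemp)
echo 'export JAVA_HOME=${JAVA_HOME}' > "$f"
sed -i 's|^export JAVA_HOME=.*|export JAVA_HOME=/usr/java/jdk1.8.0_112|' "$f"
cat "$f"   # the placeholder line is now pinned to the real JDK path
```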
3) Edit /etc/hosts (already done in step 1 above)
4) Edit core-site.xml
cd $HADOOP_HOME
cp ./share/doc/hadoop/hadoop-project-dist/hadoop-common/core-default.xml ./etc/hadoop/core-site.xml
cp ./share/doc/hadoop/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml ./etc/hadoop/hdfs-site.xml
cp ./share/doc/hadoop/hadoop-yarn/hadoop-yarn-common/yarn-default.xml ./etc/hadoop/yarn-site.xml
cp ./share/doc/hadoop/hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml ./etc/hadoop/mapred-site.xml
vim $HADOOP_HOME/etc/hadoop/core-site.xml
Set the following properties (inside the <configuration> element):
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.80.100:19000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/game/hadoop/tmp</value>
</property>
5) Edit hdfs-site.xml
<property>
  <name>dfs.namenode.rpc-address</name>
  <value>192.168.80.100:19001</value>
</property>
<property>
  <name>dfs.namenode.http-address</name>
  <value>0.0.0.0:10070</value>
</property>
6) Edit mapred-site.xml
cp mapred-site.xml.template mapred-site.xml
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
7) Edit yarn-site.xml
<property>
  <description>The http address of the RM web application.</description>
  <name>yarn.resourcemanager.webapp.address</name>
  <value>${yarn.resourcemanager.hostname}:18088</value>
</property>
5. Start the cluster
1) Format the NameNode
cd $HADOOP_HOME/bin
./hdfs namenode -format
2) Start HDFS
/home/game/soft/hadoop/sbin/start-dfs.sh
Run jps to check whether startup succeeded:
16704 DataNode
16545 NameNode
16925 SecondaryNameNode
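That eyeball check can be automated; this sketch stubs the jps output with the sample above (on a live cluster, set jps_out=$(jps) instead):

```shell
# Fail loudly if any expected HDFS daemon is missing from the jps output
jps_out="16704 DataNode
16545 NameNode
16925 SecondaryNameNode"
missing=0
for d in NameNode DataNode SecondaryNameNode; do
  echo "$jps_out" | grep -qw "$d" || { echo "$d not running"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "HDFS daemons up"
```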
hdfs dfs -ls hdfs://192.168.80.100:19001/
3) Start YARN
/home/game/hadoop-2.7.3/sbin/start-yarn.sh
[game@IM-SJ01-Server18 sbin]$ jps
17427 NodeManager
19668 ResourceManager
yarn node -list
yarn node -status <NodeId>   # NodeId as shown in the 'yarn node -list' output
4) Web UIs
192.168.80.100:10070   # NameNode web UI (dfs.namenode.http-address)
192.168.80.100:18088   # ResourceManager web UI (yarn.resourcemanager.webapp.address)
6. Upload test
hadoop fs -mkdir -p hdfs://192.168.80.100:19001/test/
hadoop fs -copyFromLocal ./test.txt hdfs://192.168.80.100:19001/test/
hadoop fs -ls hdfs://192.168.80.100:19001/
hadoop fs -put /opt/program/userall20140828 hdfs://192.168.80.100:19001/tmp/tvbox/