1) JDK 1.6 or newer (this article uses JDK 1.7; do not install JDK 1.8 — it is not compatible with Hadoop 2.4.0, and building the 2.4.0 source under it fails with many errors)
2) Maven 3.0 or newer
3) ProtocolBuffer 2.5.0
4) CMake 2.6 or newer
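Before starting, it is worth confirming that all four prerequisites are actually on the PATH. The sketch below only checks that each standard binary is reachable; install locations are up to you:

```shell
# Quick sanity check of the build prerequisites for Hadoop 2.4.0
# (JDK, Maven, ProtocolBuffer, CMake). Prints found/MISSING per tool.
missing=0
for tool in java mvn protoc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
    missing=$((missing + 1))
  fi
done
echo "missing prerequisites: $missing"
```

If anything is reported MISSING, install it before attempting the build; also double-check with `java -version` that the JDK found first on the PATH is 1.7, not 1.8.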
Add the following to /etc/profile (then reload it with source /etc/profile):
export JAVA_HOME=/usr/java/jdk1.7.0_15
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export M2_HOME=/usr/local/maven3
export M2=$M2_HOME/bin
export MAVEN_OPTS="-Xms256m -Xmx512m"
export HADOOP_HOME=/usr/local/hadoop
export CMAKE_HOME=/usr/local
export PROTOC_HOME=/usr/local/protobuf
export PATH=$JAVA_HOME/bin:$M2:$HADOOP_HOME/bin:$CMAKE_HOME/bin:$PROTOC_HOME/bin:$PATH
Building the Hadoop Source Code
Once the preparation above is done, start the build with: mvn package -Pdist -DskipTests -Dtar. Again, make sure you are not using JDK 1.8.
If you also need the native libraries, build with: mvn package -Pdist,native -DskipTests -Dtar. The native build is required when C/C++ programs need to access HDFS and the other native interfaces. You can also build just the native libraries with: mvn package -Pnative -DskipTests -Dtar.
Related build commands:
1) mvn package -Pdist -DskipTests -Dtar (binary distribution, without native code or documentation)
2) mvn package -Pdist,native,docs,src -DskipTests -Dtar (binary and source distributions, with native code and documentation)
3) mvn package -Psrc -DskipTests (source distribution only)
4) mvn package -Pdist,native,docs -DskipTests -Dtar (binary distribution with native code and documentation)
5) mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site (local staging copy of the project website)
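Putting it together, a typical end-to-end build looks like the sketch below. The source directory name is an assumption — use whatever directory you unpacked the 2.4.0 source into:

```shell
# Full build with native libraries (sketch; adjust the path to your
# unpacked source tree).
cd hadoop-2.4.0-src
mvn package -Pdist,native -DskipTests -Dtar
# On success the binary distribution is written under hadoop-dist/target/.
```

The first run downloads a large number of Maven dependencies, so expect it to take considerably longer than subsequent builds.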
1. yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>10.228.254.135:8032</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>10.228.254.135:8030</value>
  </property>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>10.228.254.135:8031</value>
  </property>
  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>10.228.254.135:8033</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>10.228.254.135:8088</value>
  </property>
</configuration>
2. mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>10.228.254.135:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>10.228.254.135:19888</value>
  </property>
</configuration>
3. hdfs-site.xml
<configuration>
  <property>
    <name>dfs.namenode.http-address</name>
    <value>10.228.254.135:9001</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop/dfs/data</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
4. core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.228.254.135:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>131072</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/usr/local/hadoop/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>hadoop.proxyuser.hduser.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hduser.groups</name>
    <value>*</value>
  </property>
</configuration>
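With the four files in place, a typical first start-up looks like the following sketch (assuming HADOOP_HOME=/usr/local/hadoop as set in /etc/profile; this sequence is a standard Hadoop 2.x bring-up, not something specific to this article):

```shell
# Format the NameNode once, then bring up HDFS, YARN, and the job
# history server that mapreduce.jobhistory.address points at.
$HADOOP_HOME/bin/hdfs namenode -format
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
$HADOOP_HOME/bin/hdfs dfsadmin -report   # confirm the DataNode registered
```

After start-up, the ResourceManager web UI is at the yarn.resourcemanager.webapp.address configured above (port 8088) and the job history UI at port 19888.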