Compiling Hadoop 2.2.0
I was using the binary release downloaded from the official site, and the single-node HDFS test passed, but warnings kept showing up:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
The cause is that the native libraries in the official binary release are 32-bit, which is a real pain, so they have to be recompiled. The file command shows the library is 32-bit:
$ file $HADOOP_PREFIX/lib/native/libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, BuildID[sha1]=0x9eb1d49b05f67d38454e42b216e053a27ae8bac9, not stripped
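The same check can be done without the file utility by reading the ELF class byte directly; a minimal sketch (elf_class is a hypothetical helper, not part of Hadoop):

```shell
# Detect whether a binary is 32- or 64-bit by reading the ELF class byte
# (offset 4 in the ELF header: 1 = 32-bit, 2 = 64-bit).
elf_class() {
  case "$(od -An -td1 -j4 -N1 "$1" | tr -d ' ')" in
    1) echo 32-bit ;;
    2) echo 64-bit ;;
    *) echo unknown ;;
  esac
}
elf_class /bin/ls
```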
1. Install the following system packages:
yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel
2. Install Maven
Using version 3.0.5:
# tar zxf apache-maven-3.0.5-bin.tar.gz -C /usr/local/
# ln -s /usr/local/apache-maven-3.0.5 /usr/local/maven
Add the following to /etc/profile:
export MAVEN_HOME=/usr/local/maven
export PATH=${MAVEN_HOME}/bin:$PATH
source /etc/profile
3. Install Ant
wget
tar zxf apache-ant-1.9.3-bin.tar.gz -C /usr/local/
vim /etc/profile
export ANT_HOME=/usr/local/apache-ant-1.9.3
export PATH=$PATH:$ANT_HOME/bin
source /etc/profile
4. Install FindBugs
wget
tar zxf findbugs-2.0.3.tar.gz -C /usr/local/
vim /etc/profile
export FINDBUGS_HOME=/usr/local/findbugs-2.0.3
export PATH=$PATH:$FINDBUGS_HOME/bin
source /etc/profile
5. Install protobuf
protobuf 2.5.0 or later is absolutely required.
wget
$ tar zxf protobuf-2.5.0.tar.gz
$ cd protobuf-2.5.0
$ ./configure
$ make
$ make install
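The Hadoop build aborts later if protoc --version does not report the expected release, so it is worth failing fast here. A small sketch (require_protoc is a hypothetical helper, not part of the build):

```shell
# Guard: verify the installed protoc matches the release Hadoop expects
# before kicking off a long mvn run.
require_protoc() {
  want=$1
  got=$(protoc --version 2>/dev/null | awk '{print $2}')
  [ "$got" = "$want" ] || { echo "need protoc $want, found '${got:-none}'" >&2; return 1; }
}
require_protoc 2.5.0 || echo "protoc check failed"
```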
6. Download the Hadoop source
wget
tar zxf hadoop-2.2.0-src.tar.gz
7. Apply patches
8. Build
source /etc/profile
mvn package -DskipTests -Pdist,native -Dtar
An error is thrown:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
The cause: protoc was built and installed under the root user, and the hadoop user cannot find it.
alias protoc='/usr/local/bin/protoc'
protoc --version
libprotoc 2.5.0
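A caveat: an alias only applies to interactive shells, while Maven spawns protoc as a child process, so exporting the path is the more reliable route. A sketch, assuming protobuf was installed with the default /usr/local prefix:

```shell
# Make the freshly built protoc (and its shared library) visible to
# non-interactive processes such as Maven's forked protoc invocations.
export PATH=/usr/local/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
```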
Rebuild:
source /etc/profile
mvn package -DskipTests -Pdist,native -Dtar
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:compile (default-compile) on project hadoop-hdfs: Fatal error compiling: Error while executing the compiler. InvocationTargetException: Java heap space -> [Help 1]
This is said to be caused by too little memory, so I'll try giving the build more.
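The Maven JVM's heap is usually raised through the MAVEN_OPTS environment variable; a sketch (the sizes are guesses to tune for your machine, and MaxPermSize is a JDK 6/7 era flag matching this setup):

```shell
# Give the Maven JVM a larger heap before re-running the build.
export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=512m"
```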
And it really did build through this time. Not easy at all.
The build output is under the hadoop-dist/target/ directory.
Replace the native libraries:
Check the newly built native library:
# file hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
Replace the library from the official download with this one.
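The swap itself is just a copy over the old files; a minimal sketch (replace_native is a hypothetical helper, and the paths assume the build-tree layout above):

```shell
# Back up the 32-bit native libraries and drop in the freshly built 64-bit ones.
replace_native() {
  prefix=$1 build=$2
  mv "$prefix/lib/native" "$prefix/lib/native.32bit.bak" &&
  cp -r "$build/lib/native" "$prefix/lib/"
}
# e.g.: replace_native "$HADOOP_PREFIX" hadoop-dist/target/hadoop-2.2.0
```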
Then start Hadoop, and the earlier warnings are finally gone.