Category: HADOOP

2015-11-27 10:44:44

Hadoop builds libhdfs with CMake, so install CMake before compiling.


Then change into the libhdfs source directory, e.g. /usr/local/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src


Run cmake to generate the Makefile (assuming the JDK is installed at /usr/local/jdk1.7.0_55):
cmake -DGENERATED_JAVAH=/usr/local/jdk1.7.0_55 -DJAVA_HOME=/usr/local/jdk1.7.0_55 .


On success, a Makefile appears in the directory; run make to build libhdfs.so and libhdfs.a.
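Putting the steps above together, a minimal build session might look like this (paths are from this post's environment; adjust them to your own):

```shell
# Paths below are from this post's environment; adjust to your own.
cd /usr/local/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src
cmake -DGENERATED_JAVAH=/usr/local/jdk1.7.0_55 -DJAVA_HOME=/usr/local/jdk1.7.0_55 .
make   # produces libhdfs.so and libhdfs.a
```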


If you run into an error like:
/usr/local/jdk1.7.0_55/jre/lib/amd64/server/libjvm.so: file not recognized: File format not recognized


then consider upgrading the linker ld (see http://blog.chinaunix.net/uid-20682147-id-4239779.html for details).
ld is part of GNU binutils, and a newer release can be downloaded.
Note that after upgrading gcc and ld you need to update the PATH environment variable and re-run cmake; otherwise the old gcc and ld may still be picked up.
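A sketch of the environment refresh after upgrading the toolchain (the install prefixes below are assumptions, not from this post):

```shell
# Assumed install prefixes for the newer gcc and binutils.
export PATH=/usr/local/gcc/bin:/usr/local/binutils/bin:$PATH
hash -r               # drop the shell's cached command lookups
gcc --version         # confirm the new compiler is found first
ld --version          # confirm the new linker is found first
rm -f CMakeCache.txt  # force cmake to re-detect the toolchain
cmake -DGENERATED_JAVAH=/usr/local/jdk1.7.0_55 -DJAVA_HOME=/usr/local/jdk1.7.0_55 .
```

Removing CMakeCache.txt matters because CMake caches the compiler paths it detected on the first run.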


/usr/local/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src # cmake -DGENERATED_JAVAH=/usr/local/java_1_7 -DJAVA_HOME=/usr/local/java_1_7
-- The C compiler identification is GNU 4.1.2
-- The CXX compiler identification is GNU 4.1.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
JAVA_HOME=/usr/local/java_1_7, JAVA_JVM_LIBRARY=/usr/local/java_1_7/jre/lib/amd64/server/libjvm.so
JAVA_INCLUDE_PATH=/usr/local/java_1_7/include, JAVA_INCLUDE_PATH2=/usr/local/java_1_7/include/linux
Located all JNI components successfully.
-- Performing Test HAVE_BETTER_TLS
-- Performing Test HAVE_BETTER_TLS - Success
-- Performing Test HAVE_INTEL_SSE_INTRINSICS
-- Performing Test HAVE_INTEL_SSE_INTRINSICS - Success
-- Looking for dlopen in dl
-- Looking for dlopen in dl - found
-- Found JNI: /usr/local/java_1_7/jre/lib/amd64/libjawt.so  
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.20") 
-- checking for module 'fuse'
--   package 'fuse' not found
-- Failed to find Linux FUSE libraries or include files.  Will not build FUSE client.
-- Configuring done
-- Generating done
-- Build files have been written to: /usr/local/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/src


Next, run make to build libhdfs.a.
No install target is provided, so the header hdfs.h and the library libhdfs.a must be installed by hand.
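A possible manual install, assuming /usr/local as the destination prefix (the in-tree location of hdfs.h and the built libraries varies by Hadoop version, so locate them first):

```shell
# Assumed destination prefix; run from where the build artifacts live.
cp hdfs.h /usr/local/include/
cp libhdfs.a /usr/local/lib/
cp libhdfs.so* /usr/local/lib/ 2>/dev/null || true  # if the shared lib was built
ldconfig   # refresh the dynamic linker cache
```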


Note 1: for hadoop-2.8.0 the build directory is:
hadoop-2.8.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src


Note 2: if you hit the error "JVM_ARCH_DATA_MODEL is not defined", pass -DJVM_ARCH_DATA_MODEL=64, e.g.:
cmake -DCMAKE_BUILD_TYPE=Debug -DCMAKE_INSTALL_PREFIX=/usr/local/hdfs-2.8.0 -DJVM_ARCH_DATA_MODEL=64
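Once hdfs.h and the library are installed, a minimal libhdfs client can be sketched as below. The include/link paths in the comment are assumptions for a typical layout; libjvm must be findable at runtime (e.g. via LD_LIBRARY_PATH), and CLASSPATH must contain the Hadoop jars.

```c
// Minimal libhdfs sketch: connect, write a small file, disconnect.
// Hypothetical build line (adjust paths to your install):
//   gcc demo.c -I/usr/local/include -L/usr/local/lib -lhdfs \
//       -L/usr/local/jdk1.7.0_55/jre/lib/amd64/server -ljvm -o demo
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include <stdio.h>
#include <string.h>
#include "hdfs.h"

int main(void) {
    /* "default" picks up fs.defaultFS from the Hadoop configuration. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (fs == NULL) {
        fprintf(stderr, "hdfsConnect failed\n");
        return 1;
    }

    const char *path = "/tmp/libhdfs_demo.txt";
    /* bufferSize/replication/blocksize of 0 mean "use the defaults". */
    hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (out == NULL) {
        fprintf(stderr, "hdfsOpenFile failed\n");
        hdfsDisconnect(fs);
        return 1;
    }

    const char *msg = "hello libhdfs\n";
    hdfsWrite(fs, out, msg, (tSize)strlen(msg));
    hdfsFlush(fs, out);
    hdfsCloseFile(fs, out);
    hdfsDisconnect(fs);
    return 0;
}
```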
