Category: HADOOP
2014-10-28 13:42:27
Installing Hive on Linux
Environment:
OS: Red Hat Linux AS5
Hive: 0.9; Hadoop: 1.2
1. Installation Steps
Download the installation media from the Apache Hive download page, choosing a version as appropriate. The version downloaded here is hive-0.9.0.tar.gz.
The following steps need to be performed only on the master node (the NameNode).
Log in as the hadoop user:
[hadoop1@node1 ~]$ echo $HADOOP_HOME
/usr1/hadoop
Copy the installation media to the following directory:
[root@node1 hive]# cp hive-0.9.0.tar.gz /usr1/
Extract the archive:
[root@node1 usr1]# tar -zxvf hive-0.9.0.tar.gz
Rename the directory:
[root@node1 usr1]# ls
hadoop hive-0.9.0 hive-0.9.0.tar.gz
[root@node1 usr1]# mv hive-0.9.0 hive
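The extract-and-rename steps can be tried safely in a scratch directory first. The sketch below builds a stand-in tarball (the directory contents are illustrative, not the real Hive distribution) and then repeats the same `tar` and `mv` commands:

```shell
# Work in a throwaway directory so /usr1 is untouched
WORK=$(mktemp -d)
cd "$WORK"

# Build a stand-in for hive-0.9.0.tar.gz (illustrative contents only)
mkdir hive-0.9.0
touch hive-0.9.0/README.txt
tar -zcf hive-0.9.0.tar.gz hive-0.9.0
rm -r hive-0.9.0

# Same commands as in the steps above: extract, then rename
tar -zxvf hive-0.9.0.tar.gz
mv hive-0.9.0 hive
ls
```

After the `mv`, only `hive` and the original tarball remain, which matches the `ls` output shown in the transcript above.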
Grant ownership of the hive directory to the hadoop user:
[root@node1 usr1]# chown -R hadoop1:hadoop1 ./hive
Set the environment variable HIVE_HOME=/usr1/hive in the hadoop user's .bash_profile.
The modified file is shown below (the changed lines were highlighted in red in the original):
[hadoop1@node1 ~]$ more .bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
export JAVA_HOME=/usr/java/jdk1.8.0_05
export JRE_HOME=/usr/java/jdk1.8.0_05/jre
export CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export HADOOP_HOME=/usr1/hadoop
HIVE_HOME=/usr1/hive
export PATH=$HADOOP_HOME/bin:$HIVE_HOME/bin:$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
PATH=$PATH:$HOME/bin
export PATH
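After saving .bash_profile, reload it with `source ~/.bash_profile` (or log out and back in) so the variables take effect. The key exports from the profile above can be sanity-checked directly in the current shell:

```shell
# Mirror the .bash_profile entries above and confirm they resolve
export HADOOP_HOME=/usr1/hadoop
export HIVE_HOME=/usr1/hive
export PATH=$HADOOP_HOME/bin:$HIVE_HOME/bin:$PATH

echo "HADOOP_HOME=$HADOOP_HOME"
echo "HIVE_HOME=$HIVE_HOME"
```

If `$HIVE_HOME/bin` is on the PATH, the `hive` command used later can be invoked from any directory.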
Next, edit hive-config.sh; this file's default path is /usr1/hive/bin/.
Add the corresponding environment variables at the end of the file:
export JAVA_HOME=/usr/java/jdk1.8.0_05
export JRE_HOME=/usr/java/jdk1.8.0_05/jre
export CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export HADOOP_HOME=/usr1/hadoop
HIVE_HOME=/usr1/hive
export PATH=$HADOOP_HOME/bin:$HIVE_HOME/bin:$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
PATH=$PATH:$HOME/bin
export PATH
The hive/conf directory does not ship with these two files; it contains only hive-default.xml.template, so make two copies of hive-default.xml.template and name them hive-default.xml and hive-site.xml:
[hadoop1@node1 conf]$ cp hive-default.xml.template hive-default.xml
[hadoop1@node1 conf]$ cp hive-default.xml.template hive-site.xml
Note: hive-default.xml preserves the default configuration, while hive-site.xml holds site-specific settings that override the defaults.
Log in as the hadoop user and start Hive:
[hadoop1@node1 ~]$ hive
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Logging initialized using configuration in jar:file:/usr1/hive/lib/hive-common-0.9.0.jar!/hive-log4j.properties
Hive history file=/tmp/hadoop1/hive_job_log_hadoop1_201410231338_539060526.txt
Edit the hive-site.xml file and add the following information:
Copy the MySQL JDBC driver to the $HIVE_HOME/lib directory:
[hadoop1@node1 soft]$ cp mysql-connector-java-5.1.32-bin.jar /usr1/hive/lib
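The exact hive-site.xml contents used by the author are not shown above. The property names below are Hive's standard JDO metastore settings for a MySQL backend; the hostname, database name, user, and password are placeholders to adapt to your environment:

```xml
<!-- Hypothetical MySQL metastore settings; adjust host, db, user, password -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
```

The ConnectionDriverName value is the class provided by the mysql-connector-java jar copied into $HIVE_HOME/lib above.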
hive> create table x(a int);
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/x. Name node is in safe mode.
The reported blocks is only 16 but the threshold is 0.9990 and the total blocks 33. Safe mode will be turned off automatically.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2497)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2469)
at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:911)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
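The failure is the NameNode's startup safe mode: HDFS stays read-only until the fraction of reported blocks reaches the configured threshold (0.9990 in the message above, the default for dfs.safemode.threshold.pct). With only 16 of 33 blocks reported, the ratio is far below that:

```shell
# Ratio of reported to total blocks from the message above:
# 16 / 33 is well under the 0.9990 threshold, so the NameNode
# refuses writes such as creating /user/hive/warehouse/x.
awk 'BEGIN { printf "%.4f\n", 16/33 }'
```

Normally safe mode ends on its own once enough DataNodes report in; leaving it manually, as below, is the quick workaround.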
Solution:
Turn safe mode off:
[hadoop1@node1 ~]$ hadoop dfsadmin -safemode leave
Warning: $HADOOP_HOME is deprecated.
Safe mode is OFF
-- The End --