Category: HADOOP
2017-11-09 17:36:44
In this document, {user} stands for the account name, e.g. autolog. If you were given a different account, substitute it wherever {user} appears.
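For convenience, the account name can be defined once as a shell variable so the commands below can be pasted with $user in place of {user} (a minimal sketch; autolog is only the example name from above):

# hypothetical helper: set the account once and reuse it in later commands
user=autolog
echo "setting up the Hadoop client for $user"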
export JAVA_HOME=/opt/java/jdk
mv /opt/sohuhadoop.2.client /opt/sohuhadoop
Change the owner of sohuhadoop to {user}:
chown -R {user}:{user} /opt/sohuhadoop
Set the following environment variables (for example in {user}'s shell profile):
export JAVA_HOME=/opt/java/jdk
export HADOOP_CONF_DIR=/opt/sohuhadoop/conf
export HADOOP_HOME=/opt/sohuhadoop/hadoop
export HIVE_HOME=/opt/sohuhadoop/hive
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin
export JAVA_LIBRARY_PATH=/opt/sohuhadoop/hadoop/lib/native/Linux-amd64-64
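A quick way to confirm the environment took effect (a minimal sketch; it assumes the exports above were placed in ~/.bash_profile):

# reload the profile and verify the toolchain is on PATH
source ~/.bash_profile
echo $HADOOP_HOME      # should print /opt/sohuhadoop/hadoop
hadoop version         # should report the CDH3 build
which hive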
Configure the Kerberos client (/etc/krb5.conf):

[libdefaults]
    default_realm = HADOOP.SOHU.COM
    dns_lookup_kdc = false
    dns_lookup_realm = false
    clockskew = 120
    renewable = true

[realms]
    HADOOP.SOHU.COM = {
        kdc = zw-hadoop-master:88
        admin_server = zw-hadoop-master:749
    }

[domain_realm]

[appdefaults]
    pam = {
        debug = false
        ticket_lifetime = 36000
        renew_lifetime = 360000
        forwardable = true
        krb4_convert = false
        renewable = true
    }
    kinit = {
        ticket_lifetime = 36000
        renew_lifetime = 360000
        forwardable = true
    }
Add an /etc/hosts entry for the master (replace xxx.xxx.xxx.xxx with its actual IP):
xxx.xxx.xxx.xxx zw-hadoop-master. zw-hadoop-master
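To check that the entry resolves, a small verification sketch (not part of the original steps):

getent hosts zw-hadoop-master   # should print the IP configured above
ping -c 1 zw-hadoop-master      # and the host should be reachable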
Create the local Hadoop temp directory and hand it over to {user}:
mkdir -p /pvdata/hadoopdata/tmp/hadoop-{user}
chown -R {user}:{user} /pvdata/hadoopdata/tmp/hadoop-{user}
Add a crontab entry so the Kerberos ticket is refreshed from the keytab every 12 hours:
9 */12 * * * /usr/kerberos/bin/kinit {user} -k -t /home/{user}/{user}.keytab
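One way to install that entry (a sketch; it assumes the line goes into {user}'s own crontab):

# open {user}'s crontab in an editor and paste the line above
crontab -e
# confirm it was saved
crontab -l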
The first time, run kinit manually:
/usr/kerberos/bin/kinit {user} -k -t /home/{user}/{user}.keytab
Then check the ticket with klist -e.
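If you prefer a scriptable check over reading the klist output by eye, klist's silent mode can be used (a minimal sketch):

# klist -s exits 0 only if a valid, unexpired ticket cache exists
if klist -s; then
    echo "Kerberos ticket is valid"
else
    echo "no valid ticket - run kinit again" >&2
fi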
List the HDFS directory you have access to:
hadoop fs -ls /user/{user}
Upload a file named test to that directory:
hadoop fs -put test /user/{user}/
Run a MapReduce test:
cd /opt/sohuhadoop/hadoop
bin/hadoop jar hadoop-examples-0.20.2-cdh3u1.jar wordcount aaaa /user/{user}/output
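Once the job finishes, the result can be inspected on HDFS (a sketch; part-* matches the reducer output files that the wordcount example writes):

hadoop fs -ls /user/{user}/output           # job output directory
hadoop fs -cat /user/{user}/output/part-*   # word counts, one word per line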
Note: the client's clock must be kept in sync with standard time; otherwise Kerberos authentication fails once the drift exceeds the clockskew limit (120 seconds in the krb5.conf above).
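One common way to keep the clock in sync is a one-shot NTP sync plus a cron entry (a sketch; pool.ntp.org is a public pool used here as a stand-in, your environment may have its own NTP server):

# sync the clock once against an NTP server (run as root)
/usr/sbin/ntpdate pool.ntp.org
# optionally keep it synced hourly via cron
# 0 * * * * /usr/sbin/ntpdate pool.ntp.org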