Category: HADOOP
2014-10-30 10:48:09
Export to the local filesystem
Export the Hive table data to the OS directory /home/hadoop1/file/tb_emp_info. After the export, a file named 000000_0 is generated under the tb_emp_info directory; it seems you cannot specify a particular output filename.
hive> insert overwrite local directory '/home/hadoop1/file/tb_emp_info' select * from tb_emp_info;
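By default the exported file uses Hive's ^A (\001) character as the field separator, which is awkward to read. If you need a specific delimiter, Hive 0.11 and later also accept a ROW FORMAT clause on the directory export (a sketch, assuming that Hive version or newer):

hive> insert overwrite local directory '/home/hadoop1/file/tb_emp_info'
    > row format delimited fields terminated by ','
    > select * from tb_emp_info;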
Export to HDFS
hive> insert overwrite directory '/home/hadoop1/hive_output/tb_emp_info' select * from tb_emp_info;
Total MapReduce jobs = 2
Launching Job 1 out of 2
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201410300915_0003, Tracking URL =
Kill Command = /usr1/hadoop/libexec/../bin/hadoop job -Dmapred.job.tracker= -kill job_201410300915_0003
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2014-10-30 10:35:42,580 Stage-1 map = 0%, reduce = 0%
2014-10-30 10:35:44,595 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 0.38 sec
2014-10-30 10:35:45,603 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 0.38 sec
MapReduce Total cumulative CPU time: 380 msec
Ended Job = job_201410300915_0003
Ended Job = -926438869, job is filtered out (removed at runtime).
Moving data to: hdfs://192.168.56.101:9000/tmp/hive-hadoop1/hive_2014-10-30_10-35-38_612_6377994640941387871/-ext-10000
Moving data to: /home/hadoop1/hive_output/tb_emp_info
Failed with exception Unable to rename: hdfs://192.168.56.101:9000/tmp/hive-hadoop1/hive_2014-10-30_10-35-38_612_6377994640941387871/-ext-10000 to: /home/hadoop1/hive_output/tb_emp_info
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
MapReduce Jobs Launched:
Job 0: Map: 1 Cumulative CPU: 0.38 sec HDFS Read: 768 HDFS Write: 428 SUCCESS
Total MapReduce CPU Time Spent: 380 msec
Solution:
The MoveTask fails because the export target is a multi-level directory whose parent does not yet exist. Change the hive.insert.into.multilevel.dirs parameter from its default of false to true so that Hive is allowed to write into (and create) multi-level target directories.
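The parameter can be set for the current session before re-running the export, for example:

hive> set hive.insert.into.multilevel.dirs=true;
hive> insert overwrite directory '/home/hadoop1/hive_output/tb_emp_info' select * from tb_emp_info;

To make it permanent for all sessions, the same property can instead go into hive-site.xml (a minimal sketch):

<property>
  <name>hive.insert.into.multilevel.dirs</name>
  <value>true</value>
</property>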
-- The End --