How to Import and Export Data in Hive
This article explains how to import and export data in Hive. The methods described are simple, quick to apply, and practical, so follow along below.
1. Importing from the local file system
Source data path: /root/data
hive> load data local inpath "/root/data" overwrite into table t1;
Loading data to table default.t1
Table default.t1 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
OK
Time taken: 1.712 seconds
hive> select * from t1;
OK
zhangsan 25
lisi 27
wangwu 24
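LOAD DATA LOCAL copies the file from the local file system into t1's warehouse directory, and OVERWRITE replaces any data the table held before. It assumes t1 already exists with a schema matching the file; a minimal sketch of such a table, with the column names and tab delimiter assumed for illustration:
hive> create table t1 (name string, age int)
    > row format delimited fields terminated by '\t';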
2. Importing from HDFS
HDFS data location:
[root@crxy177 ~]# hadoop dfs -ls /
-rw-r--r-- 1 root supergroup 30 2015-05-18 10:39 /data
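If the file is not already in HDFS, it can be staged there first; a small sketch, assuming the same source file as in section 1:
[root@crxy177 ~]# hadoop dfs -put /root/data /data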
hive> load data inpath "/data" overwrite into table t1;
Loading data to table default.t1
Moved: 'hdfs://192.168.1.177:9000/user/hive/warehouse/t1/data' to trash at: hdfs://192.168.1.177:9000/user/root/.Trash/Current
Table default.t1 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
OK
Time taken: 1.551 seconds
Because the LOCAL keyword is omitted here, LOAD DATA moves the HDFS file /data into the table's warehouse directory instead of copying it, and OVERWRITE sends the table's previous contents to the trash, as the output above shows.
3. Importing with a query
Create a table:
hive> create table t2 like t1;
OK
Time taken: 0.246 seconds
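CREATE TABLE ... LIKE copies only the table definition of t1, not its data, so t2 starts out empty with the same columns. This can be confirmed before loading anything:
hive> desc t2;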
Import data. The first statement below mistypes "from" as "form" and fails; the corrected statement is run next.
hive> insert overwrite table t2 select * form t1;
FAILED: NullPointerException null
hive> insert overwrite table t2 select * from t1;
Query ID = root_20150518104747_7922f9d4-2e15-434a-8b9f-076393d73470
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1431916152610_0001, Tracking URL = http://crxy177:8088/proxy/application_1431916152610_0001/
Kill Command = /usr/local/hadoop-2.6.0/bin/hadoop job -kill job_1431916152610_0001
Interrupting... Be patient, this might take some time.
Press Ctrl+C again to kill JVM
killing job with: job_1431916152610_0001
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2015-05-18 10:47:40,679 Stage-1 map = 0%, reduce = 0%
Ended Job = job_1431916152610_0001 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
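The job above was interrupted with Ctrl+C partway through (the "Interrupting..." lines), so it ends with an error and t2 is left unpopulated. Letting the same statement run to completion and then querying the target table should show the rows copied from t1; a sketch of the two statements, output omitted:
hive> insert overwrite table t2 select * from t1;
hive> select * from t2;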
4. Importing into multiple tables at once
Create tables t3 and t4:
hive> create table t3 like t1;
OK
Time taken: 1.235 seconds
hive> create table t4 like t1;
OK
Time taken: 0.211 seconds
Import data into multiple tables. A single FROM t1 clause drives several INSERT statements, so the source table is scanned only once:
hive> FROM t1
> INSERT OVERWRITE TABLE t2 SELECT * WHERE 1=1
> INSERT OVERWRITE TABLE t3 SELECT * WHERE 1=1
> INSERT OVERWRITE TABLE t4 SELECT * WHERE 1=1;
Query ID = root_20150518105252_9101659d-0990-4626-a4f7-8bad768af48b
Total jobs = 7
Launching Job 1 out of 7
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1431916152610_0002, Tracking URL = http://crxy177:8088/proxy/application_1431916152610_0002/
Kill Command = /usr/local/hadoop-2.6.0/bin/hadoop job -kill job_1431916152610_0002
Hadoop job information for Stage-3: number of mappers: 1; number of reducers: 0
2015-05-18 10:52:50,866 Stage-3 map = 0%, reduce = 0%
2015-05-18 10:53:02,273 Stage-3 map = 100%, reduce = 0%, Cumulative CPU 1.41 sec
MapReduce Total cumulative CPU time: 1 seconds 410 msec
Ended Job = job_1431916152610_0002
Stage-6 is selected by condition resolver.
Stage-5 is filtered out by condition resolver.
Stage-7 is filtered out by condition resolver.
Stage-12 is selected by condition resolver.
Stage-11 is filtered out by condition resolver.
Stage-13 is filtered out by condition resolver.
Stage-18 is selected by condition resolver.
Stage-17 is filtered out by condition resolver.
Stage-19 is filtered out by condition resolver.
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10000
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10002
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10004
Loading data to table default.t2
Loading data to table default.t3
Loading data to table default.t4
Table default.t2 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
Table default.t3 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
Table default.t4 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
MapReduce Jobs Launched:
Stage-Stage-3: Map: 1 Cumulative CPU: 1.41 sec HDFS Read: 237 HDFS Write: 288 SUCCESS
Total MapReduce CPU Time Spent: 1 seconds 410 msec
OK
Time taken: 34.245 seconds
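To check the result of the multi-table insert, each target table can be queried; given the session above, t2, t3, and t4 should each return the same three rows that were loaded into t1:
hive> select * from t2;
hive> select * from t3;
hive> select * from t4;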
With that, you should have a better understanding of how to import and export data in Hive; the best way to consolidate it is to try the steps yourself.