Friday, 28 December 2018

MySQL to Hive using SQOOP import-all-tables
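This demo copies every table in the MySQL world database on the Cloudera QuickStart VM into Hive with a single sqoop import-all-tables call. In the command below, --num-mappers 1 runs the copy as one map task, --hive-import brings the data into Hive, --hive-overwrite replaces any existing data, --create-hive-table has Sqoop create the table definitions, --compress with the Snappy codec compresses the files written to HDFS, and --outdir keeps Sqoop's generated Java classes out of the working directory. Before running it, the tables Sqoop will pick up can be previewed with list-tables (a quick sketch using the same connection; root/cloudera are the QuickStart defaults):

[cloudera@quickstart ~]$ sqoop list-tables --connect jdbc:mysql://localhost/world --username root --password cloudera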

[cloudera@quickstart ~]$ sqoop import-all-tables --num-mappers 1 --connect jdbc:mysql://localhost/world --username root --password cloudera --hive-import --hive-overwrite  --create-hive-table --compress  --compression-codec org.apache.hadoop.io.compress.SnappyCodec --outdir java_files;
chgrp: changing ownership of 'hdfs://quickstart.cloudera:8020/user/hive/warehouse/city': User does not belong to supergroup

chgrp: changing ownership of 'hdfs://quickstart.cloudera:8020/user/hive/warehouse/country': User does not belong to supergroup


Loading data to table default.city
Loading data to table default.country
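The chgrp warnings are harmless on the QuickStart VM: the cloudera user is not in the HDFS supergroup, so Hive cannot change the group on the warehouse directories, but the data still loads and both tables land in Hive's default database. To keep imported tables out of default, the same import can be pointed at a dedicated database with --hive-database (a sketch, assuming a world_stg database is created first):

hive> create database if not exists world_stg;

[cloudera@quickstart ~]$ sqoop import-all-tables --num-mappers 1 --connect jdbc:mysql://localhost/world --username root --password cloudera --hive-import --hive-overwrite --create-hive-table --hive-database world_stg --compress --compression-codec org.apache.hadoop.io.compress.SnappyCodec --outdir java_files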


hive> show tables;
OK
city
country
Time taken: 0.013 seconds, Fetched: 2 row(s)

hive> select * from city limit 10;
OK
1 Kabul AFG Kabol 1780000
2 Qandahar AFG Qandahar 237500
3 Herat AFG Herat 186800
4 Mazar-e-Sharif AFG Balkh 127800
5 Amsterdam NLD Noord-Holland 731200
6 Rotterdam NLD Zuid-Holland 593321
7 Haag NLD Zuid-Holland 440900
8 Utrecht NLD Utrecht 234323
9 Eindhoven NLD Noord-Brabant 201843
10 Tilburg NLD Noord-Brabant 193238
Time taken: 0.614 seconds, Fetched: 10 row(s)
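A quick sanity check is to compare row counts between the MySQL source and the imported Hive table; the numbers should match if the import completed cleanly (a sketch):

mysql> select count(*) from world.city;
hive> select count(*) from city;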


hive> describe formatted city;
Location:            hdfs://quickstart.cloudera:8020/user/hive/warehouse/city
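The Location line confirms the data sits under the default Hive warehouse path. The full DDL Sqoop generated, including the column types mapped from MySQL, can also be checked (a sketch):

hive> show create table city;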

[cloudera@quickstart ~]$ hdfs dfs -ls /user/hive/warehouse/city
Found 2 items
-rwxrwxrwx   1 cloudera cloudera          0 2018-12-28 06:31 /user/hive/warehouse/city/_SUCCESS
-rwxrwxrwx   1 cloudera cloudera      93338 2018-12-28 06:31 /user/hive/warehouse/city/part-m-00000.snappy
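The single part-m-00000.snappy file reflects --num-mappers 1 plus the Snappy codec. The file is not plain text, but hdfs dfs -text should be able to decompress it through the configured codecs if you want to eyeball a few rows (a sketch, assuming native Snappy support as shipped on the QuickStart VM):

[cloudera@quickstart ~]$ hdfs dfs -text /user/hive/warehouse/city/part-m-00000.snappy | head -5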

