Date: 2021-07-01 10:21:17
We now need to import the table YHD_CATEG_PRIOR into Hive.
The script is as follows:
# Create the Hive table pms.yhd_categ_prior_user
hive -e "
set mapred.job.queue.name=pms;
set mapred.job.name=[CIS]yhd_categ_prior_user;
-- Hive DDL
DROP TABLE IF EXISTS pms.yhd_categ_prior_user;
CREATE TABLE pms.yhd_categ_prior_user
(
category_id bigint,
category_name string,
category_level int,
default_import_categ_prior int,
user_import_categ_prior int,
default_eliminate_categ_prior int,
user_eliminate_categ_prior int,
update_time string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;"
# Sync the MySQL table market.YHD_CATEG_PRIOR into Hive
hadoop fs -rmr /user/pms/YHD_CATEG_PRIOR
sqoop import -Dmapred.job.queue.name=pms --connect jdbc:mysql://127.0.0.1:3306/market \
--username admin \
--password 123456 \
--table YHD_CATEG_PRIOR \
--hive-table pms.yhd_categ_prior_user \
--fields-terminated-by '\t' \
--lines-terminated-by '\n' \
--hive-overwrite \
--hive-drop-import-delims \
--hive-import
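The two steps above can be sketched as one parameterized script with basic error handling. This is a minimal sketch, not the original author's script: the variable names are mine, the host, credentials, and queue name are the placeholders from the post, and by default the commands are only echoed (dry-run) rather than executed.

```shell
#!/usr/bin/env bash
# Sketch: drop the staging dir, then run the Sqoop import into Hive.
# Hostname, username, and password are the placeholder values from the post.
set -euo pipefail

MYSQL_URL="jdbc:mysql://127.0.0.1:3306/market"
SRC_TABLE="YHD_CATEG_PRIOR"
HIVE_TABLE="pms.yhd_categ_prior_user"
STAGING_DIR="/user/pms/${SRC_TABLE}"
RUN="${RUN:-echo}"   # default is dry-run; set RUN="" to execute for real

# -rm -r -f does not fail when the dir is absent, so set -e will not abort here
$RUN hadoop fs -rm -r -f "$STAGING_DIR"

$RUN sqoop import -Dmapred.job.queue.name=pms \
  --connect "$MYSQL_URL" \
  --username admin \
  --password 123456 \
  --table "$SRC_TABLE" \
  --hive-table "$HIVE_TABLE" \
  --fields-terminated-by '\t' \
  --lines-terminated-by '\n' \
  --hive-overwrite \
  --hive-drop-import-delims \
  --hive-import
```

Sqoop fails if its target directory already exists, which is why the staging directory is removed first; `--hive-overwrite` then replaces the Hive table's contents on each run.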
After the script finishes, `desc formatted pms.yhd_categ_prior_user` shows the imported table:
# col_name data_type comment
category_id bigint None
category_name string None
category_level int None
default_import_categ_prior int None
user_import_categ_prior int None
default_eliminate_categ_prior int None
user_eliminate_categ_prior int None
update_time string None
# Detailed Table Information
Database: pms
Owner: pms
CreateTime: Fri Jun 05 18:48:01 CST 2015
LastAccessTime: UNKNOWN
Protect Mode: None
Retention: 0
Location: hdfs://yhd-jqhadoop2.int.yihaodian.com:8020/user/hive/pms/yhd_categ_prior_user
Table Type: MANAGED_TABLE
Table Parameters:
numFiles 5
numPartitions 0
numRows 0
rawDataSize 0
totalSize 447779
transient_lastDdlTime 1433501435
# Storage Information
SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat: org.apache.hadoop.mapred.TextInputFormat
OutputFormat: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed: No
Num Buckets: -1
Bucket Columns: []
Sort Columns: []
Storage Desc Params:
field.delim \t
line.delim \n
serialization.format \t
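A quick sanity check after an import like this is to compare row counts on both sides. The following is a hedged sketch, not part of the original post: the `mysql` client flags and the inline credentials mirror the placeholders above, and the comparison line is left commented out since it only makes sense on a live cluster.

```shell
#!/usr/bin/env bash
# Sketch: compare MySQL source and Hive target row counts after the import.
# Host, user, and password are the placeholder values from the post.

count_mysql() {
  # -N suppresses the column header so only the number is printed
  mysql -h 127.0.0.1 -uadmin -p123456 -N -e \
    "SELECT COUNT(*) FROM market.YHD_CATEG_PRIOR"
}

count_hive() {
  hive -e "set mapred.job.queue.name=pms;
           SELECT COUNT(*) FROM pms.yhd_categ_prior_user" 2>/dev/null
}

# Uncomment to run against a live cluster:
# [ "$(count_mysql)" = "$(count_hive)" ] && echo OK || echo MISMATCH
```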
[Sqoop] Importing a MySQL table into Hive