
Notes on connecting Spark SQL directly from IDEA on Windows to the Hive metastore

Posted: 2021-07-01 10:21:17

I'll just paste the statements directly: convert the Hive metastore database to the latin1 character set, and remember to convert both the database itself and every table in it.

 -- Run against the MySQL instance that hosts the Hive metastore (database named `hive` here)
 alter database hive character set latin1;

 use hive;

 alter table BUCKETING_COLS                 convert to character set latin1;
 alter table CDS                            convert to character set latin1;
 alter table COLUMNS_V2                     convert to character set latin1;
 alter table DATABASE_PARAMS                convert to character set latin1;
 alter table DBS                            convert to character set latin1;
 alter table FUNC_RU                        convert to character set latin1;
 alter table FUNCS                          convert to character set latin1;
 alter table GLOBAL_PRIVS                   convert to character set latin1;
 alter table PART_COL_STATS                 convert to character set latin1;
 alter table PARTITION_KEY_VALS             convert to character set latin1;
 alter table PARTITIONS                     convert to character set latin1;
 alter table ROLES                          convert to character set latin1;
 alter table SDS                            convert to character set latin1;
 alter table SEQUENCE_TABLE                 convert to character set latin1;
 alter table SERDES                         convert to character set latin1;
 alter table SKEWED_STRING_LIST             convert to character set latin1;
 alter table SKEWED_STRING_LIST_VALUES      convert to character set latin1;
 alter table SORT_COLS                      convert to character set latin1;
 alter table TAB_COL_STATS                  convert to character set latin1;
 alter table TBLS                           convert to character set latin1;
 alter table VERSION                        convert to character set latin1;
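
 To confirm the conversion took effect, a quick check against MySQL's information_schema can be used. This is a minimal sketch, not part of the original article, and it assumes the metastore database is named hive as above; it should return no rows once every table is latin1.

 -- Hypothetical check: list any metastore tables whose collation is still not latin1
 SELECT TABLE_NAME, TABLE_COLLATION
 FROM information_schema.TABLES
 WHERE TABLE_SCHEMA = 'hive'
   AND TABLE_COLLATION NOT LIKE 'latin1%';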

 4. Test

 import org.apache.spark.SparkConf
 import org.apache.spark.sql.SparkSession

 object test {
   def main(args: Array[String]): Unit = {
     // Run Spark locally with 4 threads for the test
     val conf = new SparkConf()
     conf.setAppName("TestHive")
     conf.setMaster("local[4]")

     // enableHiveSupport() tells Spark SQL to use the Hive metastore
     val spark = SparkSession.builder().config(conf).enableHiveSupport().getOrCreate()

     // If the connection works, this lists the databases known to the Hive metastore
     spark.sql("show databases").show()

     spark.stop()
   }
 }
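
 The test above assumes the Hive configuration (hive-site.xml) is already on the project classpath so Spark can locate the metastore. If it is not, the metastore address can also be set in code. The sketch below is an assumption-laden variant, not part of the original article; the thrift host is a placeholder that must be replaced with your own metastore address.

 import org.apache.spark.sql.SparkSession

 object TestHiveExplicit {
   def main(args: Array[String]): Unit = {
     val spark = SparkSession.builder()
       .appName("TestHiveExplicit")
       .master("local[4]")
       // Placeholder: point at your own Hive metastore thrift service
       .config("hive.metastore.uris", "thrift://your-metastore-host:9083")
       .enableHiveSupport()
       .getOrCreate()

     // Should list the databases known to the Hive metastore
     spark.sql("show databases").show()
     spark.stop()
   }
 }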
