【原创】Uncle's Experience Sharing (65): Spark Cannot Read Hive Tables


    Spark 2.4.3

    Steps for Spark to read a Hive table:

    1) hive-site.xml

    Put hive-site.xml under $SPARK_HOME/conf.

    2) enableHiveSupport

    SparkSession.builder.enableHiveSupport().getOrCreate()
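
    Steps 1 and 2 can also be combined without shipping a hive-site.xml, by passing the metastore address through the Spark conf; entries prefixed with spark.hadoop. are copied into the Hadoop/Hive configuration. A minimal sketch, the thrift address below is only a placeholder:

        import org.apache.spark.sql.SparkSession

        // Sketch of an alternative to hive-site.xml: pass the metastore URI via the Spark conf.
        // "thrift://metastore-host:9083" is a placeholder for the real metastore address.
        val spark = SparkSession.builder
          .appName("hive-metastore-via-conf")
          .config("spark.hadoop.hive.metastore.uris", "thrift://metastore-host:9083")
          .enableHiveSupport()
          .getOrCreate()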

    3) Test code

        import org.apache.spark.{SparkConf, SparkContext}
        import org.apache.spark.sql.SparkSession

        val sparkConf = new SparkConf().setAppName(getName)
        val sc = new SparkContext(sparkConf) // the SparkContext is created first, before the SparkSession
        val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
        spark.sql("show databases").rdd.foreach(println)

    After submitting the job with $SPARK_HOME/bin/spark-submit, however, the Hive databases do not show up. The relevant log lines:

    19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
    19/05/31 13:11:31 INFO SharedState: loading hive config file: file:/export/spark-2.4.3-bin-hadoop2.6/conf/hive-site.xml
    19/05/31 13:11:31 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
    19/05/31 13:11:31 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
    19/05/31 13:11:31 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint

    This shows that hive-site.xml has indeed been picked up.
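
    Before going further, a quick sanity check helps (a sketch, run inside the failing job): which catalog implementation did the session actually end up with? enableHiveSupport() asks for the Hive catalog via the builder option spark.sql.catalogImplementation; if the SparkContext's conf does not carry that setting, the session uses the in-memory catalog and "show databases" only returns "default".

        // Diagnostic sketch: print which catalog implementation this job ended up with.
        // "in-memory" (the internal default) means the Hive catalog is not in use.
        println("catalogImplementation = " +
          spark.sparkContext.getConf.get("spark.sql.catalogImplementation", "in-memory"))
        spark.sql("show databases").show()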

    To dig further, the same query was run through $SPARK_HOME/bin/spark-sql and $SPARK_HOME/bin/spark-shell, and both can read the Hive databases. Curious, isn't it?

    The class launched by $SPARK_HOME/bin/spark-shell is org.apache.spark.repl.Main:

    "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"

    Following the org.apache.spark.repl.Main code:

    ...
          val builder = SparkSession.builder.config(conf)
          if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
            if (SparkSession.hiveClassesArePresent) {
              // In the case that the property is not set at all, builder's config
              // does not have this value set to 'hive' yet. The original default
              // behavior is that when there are hive classes, we use hive catalog.
              sparkSession = builder.enableHiveSupport().getOrCreate()
              logInfo("Created Spark session with Hive support")
            } else {
              // Need to change it back to 'in-memory' if no hive classes are found
              // in the case that the property is set to hive in spark-defaults.conf
              builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
              sparkSession = builder.getOrCreate()
              logInfo("Created Spark session")
            }
          } else {
            // In the case that the property is set but not to 'hive', the internal
            // default is 'in-memory'. So the sparkSession will use in-memory catalog.
            sparkSession = builder.getOrCreate()
            logInfo("Created Spark session")
          }
          sparkContext = sparkSession.sparkContext
          sparkSession
    ...

    This differs from the test code in one key point, visible in the second-to-last line of the excerpt: the SparkSession is created first, and the SparkContext is then obtained from it. Also note the earlier WARN-level log:

    19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.

    This suggests that because a SparkContext already exists, configuration added through the SparkSession builder, including spark.sql.catalogImplementation=hive set by enableHiveSupport(), never reaches that context, so the session most likely falls back to the in-memory catalog. Modify the test code so that the SparkSession is created first and the SparkContext is taken from it:

        val sparkConf = new SparkConf().setAppName(getName)
        //val sc = new SparkContext(sparkConf)
        val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
        // obtain the SparkContext from the SparkSession instead of constructing it directly
        val sc = spark.sparkContext
        spark.sql("show databases").rdd.foreach(println)

    Sure enough, this time it works. I'll dig into the detailed cause when I have time; to be continued.
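
    For reference, a self-contained sketch of the working pattern (the object name, app name and query are placeholders):

        import org.apache.spark.sql.SparkSession

        object ShowHiveDatabases {
          def main(args: Array[String]): Unit = {
            // Create the SparkSession first; enableHiveSupport() records
            // spark.sql.catalogImplementation=hive on the conf used to build the SparkContext.
            val spark = SparkSession.builder
              .appName("show-hive-databases")
              .enableHiveSupport()
              .getOrCreate()

            // Derive the SparkContext from the session instead of constructing it yourself.
            val sc = spark.sparkContext
            println(sc.getConf.get("spark.sql.catalogImplementation", "in-memory")) // expected: hive

            spark.sql("show databases").show()
            spark.stop()
          }
        }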

    Original post: https://www.cnblogs.com/barneywill/p/10959418.html