Problems when integrating Spark Streaming with Spark SQL

Problem description:

When a SparkSession is used inside Spark Streaming's foreachRDD to read data from Hive, show databases lists only a default database: the session is not connected to the cluster's Hive metastore at all, but to the local catalog Spark falls back to when Hive support is not wired in correctly.

Solution:

1. Put the cluster's core-site.xml, hdfs-site.xml and hive-site.xml into the project's resources directory, so that hive.metastore.uris and the HDFS settings end up on the application's classpath (a quick diagnostic for this is sketched after this list).
2. Modify the code.
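Before touching the code, it is worth confirming that step 1 actually took effect. A minimal diagnostic sketch, run anywhere on the driver, that checks whether hive-site.xml made it onto the runtime classpath (purely a check, not part of the fix):

// If this prints null, Spark cannot see the Hive config and will fall back
// to a local catalog that only contains the default database.
val hiveSiteUrl = getClass.getClassLoader.getResource("hive-site.xml")
println(s"hive-site.xml on classpath: $hiveSiteUrl")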
The original code:

@transient
val sparkConf = new SparkConf()
        .setAppName("REPORT_SYSTEM")
        .setMaster("local[*]")
        .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        // enable backpressure
        .set("spark.streaming.backpressure.enabled", "true")
        // cap how many records each Kafka partition contributes per batch
        .set("spark.streaming.kafka.maxRatePerPartition", "100")
        
val ssc = new StreamingContext(sparkConf, Seconds(60))
...
dstream.foreachRDD(rdd => {    // dstream: the Kafka input stream created above (elided)
    if (!rdd.isEmpty()) {
        SQLContextSingleton.getInstance(rdd.sparkContext.getConf).sql("show databases").show()
    }
})
...
ssc.start()
ssc.awaitTermination()



// Driver-side lazy singleton so that a single SparkSession is reused across
// micro-batches (the name is historical: it returns a SparkSession, not a SQLContext).
object SQLContextSingleton {
    @transient
    private var instance: SparkSession = _
    def getInstance(sparkConf: SparkConf): SparkSession = {
        if (instance == null) {
            instance = SparkSession.builder()
                    .config(sparkConf)
                    .config("hive.metastore.uris", "thrift://localhost:9083")
                    .config("spark.sql.warehouse.dir", GlobalConfigUtil.hdfsHosts + "/user/hive2/warehouse")
                    .config("hive.metastore.warehouse.dir", GlobalConfigUtil.hdfsHosts + "/user/hive2/warehouse")
                    .enableHiveSupport()
                    .getOrCreate()
        }
        instance
    }
}

Two things go wrong in the original version: the Hive configuration files are not on the classpath, and the SparkSession is only built lazily inside foreachRDD, after the StreamingContext has already created a plain SparkContext, so enableHiveSupport() may no longer take effect (spark.sql.catalogImplementation is fixed when the first context is created). The fix is to build the Hive-enabled SparkSession up front and derive the StreamingContext from its SparkContext, so that streaming and SQL share one properly configured session.

The modified code:

@transient
val sparkConf = new SparkConf()
        .setAppName("REPORT_SYSTEM")
        .setMaster("local[*]")
        .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        // enable backpressure
        .set("spark.streaming.backpressure.enabled", "true")
        // cap how many records each Kafka partition contributes per batch
        .set("spark.streaming.kafka.maxRatePerPartition", "100")

val spark: SparkSession = SparkSession.builder()
        .enableHiveSupport()
        .config(sparkConf)
        .config("hive.metastore.uris", "thrift://localhost:9083")
        .getOrCreate()
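// getOrCreate() also registers this session as the default, so any later
// SparkSession.builder().getOrCreate() call in the job returns this same
// Hive-enabled instance instead of creating a second session.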

@transient
val sc = spark.sparkContext
sc.setLogLevel("WARN")
val ssc = new StreamingContext(sc, Seconds(60))

...
dstream.foreachRDD(rdd => {
    if (!rdd.isEmpty()) {
        spark.sql("show databases").show()
    }
})
...
ssc.start()
ssc.awaitTermination()
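
After this change, spark.sql("show databases").show() inside foreachRDD lists every database registered in the cluster's metastore, not just default. The same shared session can also persist each micro-batch into Hive; a minimal sketch, assuming the stream carries plain strings (the table name report_db.events is made up for illustration):

dstream.foreachRDD(rdd => {
    if (!rdd.isEmpty()) {
        import spark.implicits._
        // assumes a DStream[String]; adapt the schema to the real payload
        val df = rdd.toDF("line")
        df.write.mode("append").saveAsTable("report_db.events")
    }
})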