How to load a properties configuration file in Spark

First, the wrong way. Loading the configuration file like this does not work:

    // looks config.properties up on the classpath and opens the resulting path as a local file
    val props = new Properties()
    val loader = getClass.getClassLoader
    props.load(new FileInputStream(loader.getResource("config.properties").getFile()))
Here the configuration file is placed directly under the resources directory and looked up through the classloader. In a Spark program this fails with a file-not-found error: once the application is packaged, the resource sits inside the jar, so the path returned by getResource(...).getFile() is not a real file on the driver or executor filesystem and FileInputStream cannot open it.

The correct way:

    import java.io.FileInputStream
    import java.util.Properties

    // reads config.properties from the current working directory of the process
    val props = new Properties()
    props.load(new FileInputStream("config.properties"))
    val hdfspath = props.getProperty("hdfspath")
    val mysqlpath = props.getProperty("mysql")
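
For the bare relative path "config.properties" to resolve, the file has to sit in the working directory of the process that calls load. A common way to get it there when the job runs on a cluster is spark-submit's --files option. The sketch below assumes a YARN deployment, and the property values, class name, and jar name are only placeholders (the key names match the getProperty calls above):

    # config.properties
    hdfspath=hdfs://namenode:8020/user/test/output
    mysql=jdbc:mysql://dbhost:3306/testdb

    # ship the file so it lands in each container's working directory
    spark-submit --class com.example.EventETL \
      --master yarn --deploy-mode cluster \
      --files config.properties \
      event-etl.jar

With --files, Spark copies config.properties into the working directory of every executor (and of the driver in cluster mode), which is why the plain relative path in the code above can find it.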

You can load the configuration file in any of the places below; a sketch that loads it only once per executor follows the list.

1. Inside rdd.foreachPartition, so the file is read once for every partition, on the executor:

        kafkaStream.foreachRDD { rdd =>
            rdd.foreachPartition { partition =>

                // runs on the executor, once per partition
                val filePath = "config.properties"
                LogUtil.info(filePath)
                val props = new Properties()
                props.load(new FileInputStream(filePath))

                LogUtil.info("one")
                props.keySet().toArray().foreach { x =>
                    LogUtil.info(x + "\tone" + props.getProperty(x.toString()))
                }

                // ... per-record processing (see 2.) ...
            }
        }

2. Inside partition.foreach, so the file is read again for every single record:

        partition.foreach { x =>
            LogUtil.info(x)

            val filePath1 = "config.properties"
            LogUtil.info(filePath1)
            val props1 = new Properties()
            props1.load(new FileInputStream(filePath1))

            LogUtil.info("two")
            props1.keySet().toArray().foreach { x =>
                LogUtil.info(x + "\ttwo" + props1.getProperty(x.toString()))
            }
        }
3. In the main method, so the file is read once on the driver:

        def main(args: Array[String]): Unit = {

            var kafkaZkQuorum = ""
            var group = "EventETL_test_group"
            var topics = ""
            var numThreads = 1
            var timeDuration = 3

            var checkpointDir = "/Users/test/sparktemp"

            println("Usage: configuration file")

            // runs on the driver when the job starts
            val filePath = "config.properties"
            LogUtil.info(filePath)
            val props = new Properties()
            props.load(new FileInputStream(filePath))

            // ... set up and start the streaming job here ...
        }
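
Loading the file inside foreach (place 2) re-reads it for every record, and inside foreachPartition (place 1) for every partition. When the settings do not change while the job runs, they can instead be loaded once per executor JVM. This is only a minimal sketch under that assumption; PropsHolder is a hypothetical helper, not part of the original code, and it still relies on config.properties being in the working directory (e.g. shipped with --files):

    import java.io.FileInputStream
    import java.util.Properties

    // Hypothetical helper: the lazy val is initialised at most once per JVM,
    // the first time a task running on that executor touches PropsHolder.props.
    object PropsHolder {
        lazy val props: Properties = {
            val p = new Properties()
            p.load(new FileInputStream("config.properties"))
            p
        }
    }

    // usage inside the streaming job
    kafkaStream.foreachRDD { rdd =>
        rdd.foreachPartition { partition =>
            val hdfspath = PropsHolder.props.getProperty("hdfspath")
            partition.foreach { x =>
                // use hdfspath and the other settings here
                LogUtil.info(x + "\t" + hdfspath)
            }
        }
    }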




