Testing Spark Reading and Writing Hive in IDEA

 

1. Copy the hive-site.xml file from the cluster into the Maven project's resources directory (src/main/resources)

and add the following property:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoop000:9083</value>
</property>

Per the official documentation, the hive.metastore.warehouse.dir property in hive-site.xml is deprecated since Spark 2.0.0 (spark.sql.warehouse.dir replaces it), so the hive.metastore.warehouse.dir entry that is commented out in my hive-site.xml does not need to be added.
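With hive-site.xml on the classpath, the SparkSession only needs Hive support enabled to reach the remote metastore. A minimal sketch of what the session created in IDEA can look like (the local master and application name are my assumptions, not from the original):

import org.apache.spark.sql.SparkSession

// hive-site.xml in src/main/resources supplies hive.metastore.uris,
// so no metastore settings are hard-coded here
val spark = SparkSession.builder()
  .master("local[2]")        // run locally from IDEA; adjust as needed
  .appName("HiveSourceApp")  // arbitrary application name
  .enableHiveSupport()       // required for reading/writing Hive tables
  .getOrCreate()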

2. Start the Hive metastore service on the cluster; if it is not running, the job fails with

Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

Start command:

hive --service metastore

3. Run in IDEA

The read result matches what the same query returns when run in Hive on the cluster!
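For reference, a minimal read test can look like the sketch below; the table name emp is a placeholder, so substitute any table that already exists in your Hive warehouse:

// list the databases visible through the remote metastore
spark.sql("show databases").show()

// query a Hive table ("emp" is a placeholder name)
spark.table("emp").show()
// equivalent: spark.sql("select * from emp").show()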

If you still run into problems, compare against the pom.xml below:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.imooc.bigdata</groupId>
    <artifactId>sparksql-train</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.tools.version>2.11</scala.tools.version>
        <scala.version>2.11.8</scala.version>
        <spark.version>2.4.3</spark.version>
        <hadoop.version>2.6.0-cdh5.15.1</hadoop.version>
    </properties>

    <!--引入CDH的倉庫-->
    <repositories>
        <repository>
            <id>cloudera</id>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
        </repository>
    </repositories>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!--Spark SQL依賴-->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive-thriftserver_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- Hadoop相關依賴-->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>

    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.47</version>
    </dependency>

    <dependency>
        <groupId>com.typesafe</groupId>
        <artifactId>config</artifactId>
        <version>1.3.3</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-jdbc</artifactId>
        <version>1.2.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.kudu</groupId>
        <artifactId>kudu-client</artifactId>
        <version>1.7.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.kudu</groupId>
        <artifactId>kudu-spark2_2.11</artifactId>
        <version>1.7.0</version>
    </dependency>

    <!-- Test -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
</dependencies>

</project>

First, open up permissions on /user/hive/warehouse in the test environment: hadoop fs -chmod -R 777 /user/hive/warehouse

Building on the previous steps, add the code that reads from MySQL and writes into Hive:

import java.util.Properties
import com.typesafe.config.ConfigFactory

// Read data from MySQL and write it into Hive
val config = ConfigFactory.load()   // loads application.conf from the classpath
val url = config.getString("db.default.url")
val user = config.getString("db.default.user")
val password = config.getString("db.default.password")

val connectionProperties = new Properties()
connectionProperties.put("user", user)
connectionProperties.put("password", password)

// load the MySQL table "logins" over JDBC
val jdbcDF2 = spark.read.jdbc(url, "logins", connectionProperties)

// save the DataFrame as a Hive managed table (requires enableHiveSupport())
jdbcDF2.write.saveAsTable("mysql_stat_hive")
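The db.default.* keys above are read by Typesafe Config from an application.conf in src/main/resources. A minimal sketch with placeholder values (the host, database name, and credentials are assumptions, not from the original):

# application.conf -- adjust the values to your environment
db.default.url = "jdbc:mysql://hadoop000:3306/sparksql"
db.default.user = "root"
db.default.password = "123456"

Note that saveAsTable fails by default if mysql_stat_hive already exists; for re-runs you can set an explicit write mode, e.g. jdbcDF2.write.mode("overwrite").saveAsTable("mysql_stat_hive").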

Execution log:

Checking the table in Hive on the cluster, you can see that the data has been inserted.

It is also visible on HDFS, uploaded there from the local machine. In a real project this kind of operation is generally not allowed, but it is worth trying in a test environment.
