Spark: Caused by: java.lang.ClassNotFoundException: libsvm.DefaultSource

Today, while working with spark-mllib, I hit an error saying libsvm.DefaultSource could not be found.
The details are as follows:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: libsvm. Please find packages at http://spark.apache.org/third-party-projects.html
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:549)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:301)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:156)
	at cn.itcast.czxy.BD18.Iris$.main(Iris.scala:19)
	at cn.itcast.czxy.BD18.Iris.main(Iris.scala)
Caused by: java.lang.ClassNotFoundException: libsvm.DefaultSource
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$21$$anonfun$apply$12.apply(DataSource.scala:533)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$21$$anonfun$apply$12.apply(DataSource.scala:533)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$21.apply(DataSource.scala:533)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$21.apply(DataSource.scala:533)
	at scala.util.Try.orElse(Try.scala:84)
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
	... 7 more
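
For context, the failing call at Iris.scala:19 in the trace is a DataFrameReader.load on the "libsvm" format. A minimal sketch of that kind of code follows; the object name comes from the trace, while the session configuration and data path are assumptions:

    import org.apache.spark.sql.SparkSession

    object Iris {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("Iris")
          .master("local[*]")
          .getOrCreate()

        // "libsvm" is a data source shipped with spark-mllib; if that jar
        // is missing from the runtime classpath, lookupDataSource fails
        // with the ClassNotFoundException shown above.
        val data = spark.read
          .format("libsvm")
          .load("data/iris.libsvm") // hypothetical path

        data.show()
        spark.stop()
      }
    }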

Cause of the problem

  1. The spark-mllib dependency was missing:
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
  2. At first, the Google results said the package was simply missing, but my pom already contained the dependency, and the class still could not be found. Only after checking carefully did I notice that the dependency carried <scope>provided</scope>; commenting that line out fixed the error. Lesson learned…
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>${spark.version}</version>
            <!-- note this line -->
            <!--<scope>provided</scope>-->
        </dependency>
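A note on why this happens: with Maven, provided scope keeps a dependency on the compile classpath but off the runtime classpath, on the assumption that the runtime environment supplies it. That is reasonable when submitting to a cluster with spark-submit, since the Spark installation there already ships spark-mllib, but when running the main class directly from the IDE nothing provides the jar, so the libsvm data source cannot be loaded and you get exactly this ClassNotFoundException.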