Configuring IDEA and sbt for a Spark environment

sbt download page: https://www.scala-sbt.org/download.html. I downloaded the msi installer; the default installation is fine, just change the install directory.

The default sbt repositories are basically unreachable, so after installing Scala, IDEA (with the Scala plugin) and sbt, you need to configure both the sbt folder and the IDEA settings.

On my machine sbt is installed at D://Client/Spark/sbt; the paths below (everything under that directory) are what you need to adapt to your own installation.

After installing sbt, go to the D://Client/Spark/sbt/conf folder and edit (or create) the file repo.properties:

[repositories]
  local
  aliyun: http://maven.aliyun.com/nexus/content/groups/public/
  typesafe: http://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly
  sonatype-oss-releases
  maven-central
  sonatype-oss-snapshots
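
If only a single project needs the mirror, the same repository can also be declared per-project in build.sbt instead of (or on top of) repo.properties. This is just an illustrative sketch reusing the Aliyun URL above, not part of the original setup:

resolvers += "aliyun" at "http://maven.aliyun.com/nexus/content/groups/public/"  // per-project mirror, same URL as in repo.properties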

Next, change sbtconfig.txt (also under conf) to:

# Set the java args to high
-Xmx512M
-XX:MaxPermSize=256m
-XX:ReservedCodeCacheSize=128m
# Set the extra SBT options
-Dsbt.log.format=true
-Dsbt.boot.directory=D://Client/Spark/sbt/data/boot/
-Dsbt.global.base=D://Client/Spark/sbt/data/.sbt
-Dsbt.ivy.home=D://Client/Spark/sbt/data/.ivy2
-Dsbt.repository.config=D://Client/Spark/sbt/conf/repo.properties
-Dsbt.repository.secure=false
 
Then edit sbt/sbt.boot.properties inside D://Client/Spark/sbt/bin/sbt-launch.jar (you can open the jar directly with an archive tool such as WinRAR, edit the file and save it back into the archive; open the jar in place rather than extracting it, otherwise you will not be able to repack it into a jar). Change its contents to:

[scala]
  version: ${sbt.scala.version-2.10.5}

[app]
  org: ${sbt.organization-org.scala-sbt}
  name: sbt
  version: ${sbt.version-1.3.3}
  class: ${sbt.main.class-sbt.xMain}
  components: xsbti,extra
  cross-versioned: ${sbt.cross.versioned-false}
  resources: ${sbt.extraClasspath-}

[repositories]
  local
  spring: http://conjars.org/repo/
  cloudera: https://repository.cloudera.com/artifactory/cloudera-repos/
  aliyun: http://maven.aliyun.com/nexus/content/groups/public/
  maven-central
  sbt-maven-releases: https://repo.scala-sbt.org/scalasbt/maven-releases/, bootOnly
  sbt-maven-snapshots: https://repo.scala-sbt.org/scalasbt/maven-snapshots/, bootOnly
  typesafe-ivy-releases: https://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly
  sbt-ivy-snapshots: https://repo.scala-sbt.org/scalasbt/ivy-snapshots/, [organization]/[module]/[revision]/[type]s/[artifact](-[classifier]).[ext], bootOnly

[boot]
  directory: ${sbt.boot.directory-${sbt.global.base-${user.home}/.sbt}/boot/}
  lock: ${sbt.boot.lock-true}

[ivy]
  ivy-home: D://Client/Spark/sbt/data/.ivy2
  checksums: ${sbt.checksums-sha1,md5}
  override-build-repos: ${sbt.override.build.repos-false}
  repository-config: ${sbt.repository.config-${sbt.global.base-${user.home}/.sbt}/repositories}


Once everything is configured, run sbt once from cmd to initialize it. Initialization is rather slow, but watching D://Client/Spark/sbt/data/.ivy2 fill up shows that it is not stuck, it really is just slow.

Copy the sbtconfig.txt contents into IDEA under Settings -> Build Tools -> sbt -> VM parameters, set the Launcher to ..../sbt/bin/sbt-launch.jar (so it starts from the installed sbt rather than the bundled one), and tick "use sbt shell for build and import" on the same Settings page.

When pulling in the sbt-assembly plugin from IDEA, make sure the plugin version matches your sbt version; see the sketch below.
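
For example, sbt-assembly is normally added through project/plugins.sbt. The coordinates are real, but the version shown is only an assumption and must be one published for your sbt line (0.13.x vs 1.x), which is the point above:

// project/plugins.sbt -- illustrative; pick an sbt-assembly release that matches your sbt version
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")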

Some sbt dependencies I install from IDEA: they go in a file named build.sbt placed in the project root; once the project is opened, IDEA detects the file and downloads the dependencies automatically:

name := "iptv1"

version := "1.0"

scalaVersion := "2.10.5"
scalacOptions += "-deprecation"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0-cdh5.7.2" excludeAll ExclusionRule(organization = "javax.servlet")
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.7.2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.7.2" excludeAll ExclusionRule(organization = "javax.servlet")
libraryDependencies += "org.apache.hbase" % "hbase-protocol" % "1.2.0-cdh5.7.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.38"
libraryDependencies += "com.yammer.metrics" % "metrics-core" % "2.2.0"


//libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "2.6.0-mr1-cdh5.7.2"
// libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.7.2"
// libraryDependencies += "org.apache.hbase" % "hbase-mapreduce" % "1.2.0-cdh5.7.2"

// libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
// libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2"
//----------------------------CDH 5.13.1----------------------------//
//name := "demo_sbt2"

//version := "0.1"

//scalaVersion := "2.11.8"

//libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.3.0.cloudera2"
//libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.13.1"
//libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.13.1"

//--------------------------------------------------------------------//
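
To confirm that the dependencies above actually resolve, a small local smoke test can be dropped into the project. The file name, object name and logic here are purely illustrative (they assume the spark-core_2.10 1.6.0-cdh5.7.2 dependency from the build.sbt above), not part of the original post:

// src/main/scala/BuildCheck.scala -- illustrative smoke test only
import org.apache.spark.{SparkConf, SparkContext}

object BuildCheck {
  def main(args: Array[String]): Unit = {
    // local[2] keeps the check self-contained; no cluster is needed
    val conf = new SparkConf().setAppName("iptv1-build-check").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val counts = sc.parallelize(Seq("spark", "sbt", "spark"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collect()
    counts.foreach(println) // expect (sbt,1) and (spark,2)
    sc.stop()
  }
}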
