When running a Spark program, the following error occurred:
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/company/bi/spark/UserInfoToHbase : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:639)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:392)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:252)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:774)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:772)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
When you see the "Unsupported major.minor version 52.0" message, you can be sure the cause is a runtime JDK older than 1.8: the class was compiled with JDK 1.8 (class file version 52.0), but the JVM that loads it is an earlier version.
I checked every host in the cluster,
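The class file format makes this easy to verify by hand: the first 8 bytes of any .class file are the magic number CAFEBABE followed by the minor and major version, and major 52 corresponds to Java 8 (51 = Java 7, 50 = Java 6). The sketch below fabricates such an 8-byte header purely to demonstrate the parsing; on a real machine you would point od at your actual compiled class instead.

```shell
# Fabricated .class header for demonstration: CA FE BA BE (magic),
# 00 00 (minor version), 00 34 (major version 52 = Java 8).
f=$(mktemp)
printf '\312\376\272\276\000\000\000\064' > "$f"

# Bytes 7-8 (offset 6, length 2) hold the big-endian major version.
set -- $(od -An -j6 -N2 -tu1 "$f")
major=$(( $1 * 256 + $2 ))
echo "class file major version: $major"   # 52 -> needs a Java 8+ runtime
rm -f "$f"
```

If a JDK is available, `javap -verbose UserInfoToHbase.class | grep major` reports the same number without any byte-level parsing.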
[[email protected] spark_job_file]# java -version
java version "1.8.0_191"
Java(TM) SE Runtime Environment (build 1.8.0_191-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.191-b12, mixed mode)
and found that every host had JDK 1.8.
Checking further, I found that some hosts had no JAVA_HOME set in their environment variables, which led to the following two solutions:
Solution 1: Since my deploy mode is cluster, I added two configuration options to spark2-submit,
--conf "spark.executorEnv.JAVA_HOME=/usr/java/jdk1.8.0_191-amd64"
--conf "spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_191-amd64"
to force the JDK path. The prerequisite is that JAVA_HOME is the same path on every Spark node; to guarantee this, install the JDK from the rpm package, so that the default installation path is identical on all nodes.
Solution 2: Configure JAVA_HOME on every host and point it at the JDK 1.8 installation path.
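A full submit command with these two options might look like the sketch below. The class name comes from this post's error message and the JAVA_HOME path from the java -version check above; the master/deploy-mode flags and the jar file name are illustrative assumptions, not taken from the original setup.

```shell
spark2-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.company.bi.spark.UserInfoToHbase \
  --conf "spark.executorEnv.JAVA_HOME=/usr/java/jdk1.8.0_191-amd64" \
  --conf "spark.yarn.appMasterEnv.JAVA_HOME=/usr/java/jdk1.8.0_191-amd64" \
  user-info-to-hbase.jar   # hypothetical jar name
```

Note that both options are needed in cluster mode: spark.yarn.appMasterEnv.JAVA_HOME covers the ApplicationMaster (where the stack trace above originates), while spark.executorEnv.JAVA_HOME covers the executors.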
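For solution 2, a sketch of the per-host setup, assuming the rpm-installed JDK path used above (adjust to your own path), together with a small helper that fails when JAVA_HOME is unset, so it can be looped over ssh to find the hosts that still need fixing:

```shell
# Hypothetical /etc/profile.d/java.sh to place on every host:
#   export JAVA_HOME=/usr/java/jdk1.8.0_191-amd64
#   export PATH=$JAVA_HOME/bin:$PATH

# Audit helper: returns non-zero when JAVA_HOME is missing, so a loop like
#   for h in $(cat hosts.txt); do ssh "$h" check_java_home; done
# (host list is an assumption) quickly surfaces misconfigured nodes.
check_java_home() {
  if [ -z "${JAVA_HOME:-}" ]; then
    echo "$(hostname): JAVA_HOME is unset"
    return 1
  fi
  echo "$(hostname): JAVA_HOME=$JAVA_HOME"
}

check_java_home || true   # demo call; prints the unset warning on a bad node
```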