Hadoop and Java versions must be matched; otherwise you will hit unsupported APIs at runtime. The compatibility information below comes from the official Hadoop documentation.
Supported Java versions
- Apache Hadoop 3.x currently supports only Java 8.
- Apache Hadoop 2.7.x and later 2.x releases support both Java 7 and Java 8.
- The current Apache Hadoop 2.7 line requires Java 7. It is built and tested on both OpenJDK and Oracle (HotSpot) JDK/JRE. Earlier releases (2.6 and before) also support Java 6.
- Java 11 support is currently partial:
  - trunk (3.3.0-SNAPSHOT) supports the Java 11 runtime: HADOOP-15338 - Java 11 runtime support (RESOLVED)
  - Compiling Hadoop with Java 11 is not yet supported: HADOOP-16795 - Java 11 compile support (OPEN)
Supported JDKs/JVMs
- The Apache Hadoop community now uses OpenJDK as the build/test/release environment, so OpenJDK is the JVM supported by the community.
- Other JDKs/JVMs should also work. If you find that they do not, please search the related JIRA issues.
Tested JDK versions
The table below lists JDK versions that are known to be in use or that have been tested:
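Before applying the rules above, you need the JVM's major release number, and Java uses two numbering schemes: the legacy pre-JDK 9 "1.x" scheme (e.g. 1.8.0_242) and the modern scheme (e.g. 11.0.6). The helper below is a minimal sketch, not part of Hadoop; the function name `java_major` is made up for this illustration.

```shell
# Hypothetical helper: map a JVM version string to its major release number.
# "1.8.0_242" follows the legacy pre-JDK 9 scheme, "11.0.6" the modern one.
java_major() {
  case "$1" in
    1.*) v=${1#1.}; echo "${v%%.*}" ;;   # legacy scheme: major is the second field
    *)   echo "${1%%.*}" ;;              # modern scheme: major is the first field
  esac
}

java_major "1.8.0_242"   # prints 8  -> supported by Hadoop 3.x
java_major "11.0.6"      # prints 11 -> runtime-only support on trunk (HADOOP-15338)
```

In practice the version string would come from parsing `java -version` output; the mapping itself is all the table above needs.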
| Version | Status | Reported by |
|---|---|---|
| oracle 1.7.0_15 | Good | Cloudera |
| oracle 1.7.0_21 | Good (4) | Hortonworks |
| oracle 1.7.0_45 | Good | Pivotal |
| openjdk 1.7.0_09-icedtea | Good (5) | Hortonworks |
| oracle 1.6.0_16 | Avoid (1) | Cloudera |
| oracle 1.6.0_18 | Avoid | Many |
| oracle 1.6.0_19 | Avoid | Many |
| oracle 1.6.0_20 | Good (2) | LinkedIn, Cloudera |
| oracle 1.6.0_21 | Good (2) | Yahoo!, Cloudera |
| oracle 1.6.0_24 | Good | Cloudera |
| oracle 1.6.0_26 | Good (2) | Hortonworks, Cloudera |
| oracle 1.6.0_28 | Good | |
| oracle 1.6.0_31 | Good (3, 4) | Cloudera, Hortonworks |
For a new project with no legacy baggage, a Hadoop 3.x (or later) release is usually the better choice. For an existing project, be careful when upgrading: changes across Java versions can break existing code.
The incompatibilities below, taken from the official documentation, were introduced by JDK 1.8 updates.
Incompatible Java changes
The table below is for users upgrading the Java version of a Hadoop cluster. It documents Java changes that affect Apache Hadoop.
Java 8
| Version | Incompatible change | JDK bug ticket | Related JIRAs |
|---|---|---|---|
| 1.8.0_242 | The visibility of sun.nio.ch.SocketAdaptor changed from public to package-private. TestIPC#testRTEDuringConnectionSetup is affected. | JDK-8237177 | HADOOP-15787 - [JDK11] TestIPC.testRTEDuringConnectionSetup fails (RESOLVED) |
| 1.8.0_242 | The Kerberos Java client fails with "Message stream modified (41)" when the client requests a renewable ticket and the KDC returns a non-renewable one. If your principal is not allowed to obtain a renewable ticket, remove the "renew_lifetime" setting from your krb5.conf. | JDK-8131051 | |
| 1.8.0_171 | In Apache Hadoop 2.7.0 to 2.7.6, 2.8.0 to 2.8.4, 2.9.0 to 2.9.1, 3.0.0 to 3.0.2, and 3.1.0, KMS fails with java.security.UnrecoverableKeyException due to Enhanced KeyStore Mechanisms. Set the system property "jceks.key.serialFilter" to the following value to avoid this error: java.lang.Enum;java.security.KeyRep;java.security.KeyRep$Type;javax.crypto.spec.SecretKeySpec;org.apache.hadoop.crypto.key.JavaKeyStoreProvider$KeyMetadata;!* | | HADOOP-15473 - Configure serialFilter in KeyProvider to avoid UnrecoverableKeyException caused by JDK-8189997 (RESOLVED) |
| 1.8.0_191 | All DES cipher suites were disabled. If you are explicitly using DES cipher suites, switch to a stronger suite. | | HADOOP-16016 - TestSSLFactory#testServerWeakCiphers sporadically fails in precommit builds (RESOLVED) |
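For the 1.8.0_171 KMS failure above, one common way to set the system property is through hadoop-env.sh. This is a sketch only: the HADOOP_OPTS hook is a standard Hadoop environment variable, but verify the exact filter value against HADOOP-15473 for your release.

```shell
# Sketch of a hadoop-env.sh fragment (workaround from HADOOP-15473).
# Sets jceks.key.serialFilter so KMS can deserialize its keystore entries
# under the Enhanced KeyStore Mechanisms introduced in JDK 1.8.0_171.
export HADOOP_OPTS="$HADOOP_OPTS -Djceks.key.serialFilter=java.lang.Enum;java.security.KeyRep;java.security.KeyRep\$Type;javax.crypto.spec.SecretKeySpec;org.apache.hadoop.crypto.key.JavaKeyStoreProvider\$KeyMetadata;!*"
```

The `$` characters are escaped so the shell does not expand the inner-class names; the trailing `!*` rejects any class not explicitly listed.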