Spark from Basics to Depth (1) -- Installation, Testing, and Troubleshooting

Installation and Deployment

# Choose the version you need and download it from the official site:
# http://spark.apache.org/downloads.html

# Unpack and enter the distribution directory.
tar -zxf spark-1.4.0-bin-hadoop2.6.tgz
cd spark-1.4.0-bin-hadoop2.6

# Launch the Spark shell; here we use the Python one.
bin/pyspark
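
Once the shell is up, a quick sanity check confirms the installation works. A minimal sketch; `sc` is the SparkContext that the shell creates automatically:

# Run inside the PySpark shell.
rdd = sc.parallelize(range(100))                 # distribute the numbers 0..99
print(rdd.filter(lambda x: x % 2 == 0).count())  # count the evens; expect 50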

Problems & Troubleshooting

Error 1: "…….Name or service not known"

[GCC 4.4.7 20120313 (Red Hat 4.4.7-18)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/20 15:13:11 INFO SparkContext: Running Spark version 1.4.0
17/10/20 15:13:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/20 15:13:12 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: Gee01: Gee01: Name or service not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:821)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:814)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:814)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:871)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:871)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:387)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: Gee01: Name or service not known
    at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
    ... 20 more
17/10/20 15:13:12 INFO SparkContext: Successfully stopped SparkContext
Traceback (most recent call last):
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 113, in __init__
    conf, jsc, profiler_cls)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 165, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/pyspark/context.py", line 219, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 701, in __call__
  File "/home/django/software/spark-1.4.0-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: Gee01: Gee01: Name or service not known
    ... (Java stack trace identical to the one logged above)

Cause: the hostname cannot be resolved.
Solution:

# Check your machine's hostname; mine is Gee01.
hostname

# Edit the hosts file and save it.
vim /etc/hosts

# Make sure this line exists; the last entry must match your hostname.
127.0.0.1 localhost Gee01
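
To confirm the fix took effect, check that the name now resolves. A quick check, assuming a standard Linux system with `getent`:

# Should print a line mapping 127.0.0.1 to Gee01.
getent hosts Gee01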

Now run the shell again, and it starts normally:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Python version 2.7.13 (default, Aug 30 2017 11:49:17)
SparkContext available as sc, HiveContext available as sqlContext.

Problem 2: The log output is too verbose; can it be reduced?

Description:
After entering a single command, the log output is far too "generous".
Cause: the log level setting defaults to INFO.
Solution:

# Enter the configuration directory.
cd spark-1.4.0-bin-hadoop2.6/conf

# Copy the log4j template to log4j.properties.
cp log4j.properties.template log4j.properties

# Edit the log configuration file.
vim log4j.properties

# Change the first setting:
log4j.rootCategory=INFO, console # before
log4j.rootCategory=WARN, console # after

Re-enter the shell, and you will notice far less log output.
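If you only want to quiet a single session rather than edit the config file, the log level can also be changed from inside the shell. A sketch, assuming your build has SparkContext.setLogLevel (added around Spark 1.4):

# Run inside the PySpark shell; affects only the current session.
sc.setLogLevel("WARN")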

To exit the shell, type "quit()" or press Ctrl+D.

Problem 3: There is no tab completion in the PySpark shell, which is painful. Is there a fix?

Solution:

# Relaunch the shell with IPython as the frontend to get tab completion.
IPYTHON=1 ./bin/pyspark
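
Note that newer Spark releases removed the IPYTHON variable; on those versions the equivalent (assuming IPython is installed) is:

# Spark 2.0+ dropped IPYTHON=1; set the driver Python instead.
PYSPARK_DRIVER_PYTHON=ipython ./bin/pyspark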