When running PySpark you may hit this error:
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*])
It is actually easy to fix. Change the original
from pyspark import SparkContext
from pyspark import SparkConf
to
from pyspark import SparkContext
from pyspark import SparkConf
try:
    sc.stop()  # stop the SparkContext the shell already created
except NameError:
    pass  # sc was never defined, so there is nothing to stop
and the error goes away: with the shell's existing context stopped, you are free to create a new SparkContext.
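The stop-before-recreate guard above can be sketched without a Spark installation. The snippet below uses a hypothetical DummyContext class standing in for SparkContext (it is not part of pyspark) to show why the try/except NameError pattern works:

```python
class DummyContext:
    """Hypothetical stand-in for SparkContext in this sketch."""
    _active = None  # mimics Spark's one-active-context rule

    def __init__(self):
        if DummyContext._active is not None:
            raise ValueError("Cannot run multiple contexts at once")
        DummyContext._active = self

    def stop(self):
        DummyContext._active = None


# First context, like the one the PySpark shell creates for you.
sc = DummyContext()

# The guard: stop any existing context; swallow the NameError that
# would occur if `sc` had never been defined in this session.
try:
    sc.stop()
except NameError:
    pass

# Now creating a fresh context succeeds instead of raising ValueError.
sc = DummyContext()
```

In real PySpark code, `SparkContext.getOrCreate()` is an alternative worth considering: it returns the existing context if one is running instead of raising the error.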