I am new to Spark and I am trying to configure a SparkContext, but unfortunately I get an error message.
I wrote this code:
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext
from pyspark.sql import Row, SQLContext
import sys
import requests
# create spark configuration
conf = SparkConf()
conf.setAppName("TwitterStreamApp")
# create spark context with the above configuration
sc = SparkContext(conf=conf)
I get this error:
Py4JError Traceback (most recent call last)
<ipython-input-97-b0f526d72e5a> in <module>
1 # create spark context with the above configuration
----> 2 sc = SparkContext(conf=conf)
~\anaconda3\lib\site-packages\pyspark\context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
133 # If an error occurs, clean up in order to allow future SparkContext creation:
134 self.stop()
--> 135 raise
136
137 def _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
~\anaconda3\lib\site-packages\pyspark\context.py in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
211 self.pythonVer = "%d.%d" % sys.version_info[:2]
212
--> 213 if sys.version_info < (3, 6):
214 with warnings.catch_warnings():
215 warnings.simplefilter("once")
~\anaconda3\lib\site-packages\py4j\java_gateway.py in __getattr__(self, name)
1528 answer, self._gateway_client, self._fqn, name)
1529 else:
-> 1530 raise Py4JError(
1531 "{0}.{1} does not exist in the JVM".format(self._fqn, name))
1532
Py4JError: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM
Also, I added JAVA_HOME, SPARK_HOME, and other paths to the system environment variables, but they do not seem to take effect.
Install PySpark via pip or conda. That might help. - Faraz Mazhar
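Following up on the comment above: this Py4JError typically appears when the pip-installed pyspark package does not match the Spark version that the JVM gateway launches, or when JAVA_HOME/SPARK_HOME are not visible to the Python process (e.g. the notebook was started before the variables were set). A minimal sketch for checking the variables from inside Python before creating the context; the helper name `check_spark_env` is hypothetical, not part of PySpark:

```python
import os

# Hypothetical helper (not part of PySpark): return the names of the
# Spark-related environment variables that are missing or empty.
def check_spark_env(env=None):
    env = os.environ if env is None else env
    required = ("JAVA_HOME", "SPARK_HOME")
    return [name for name in required if not env.get(name)]

# Example: inspect the current process before building a SparkContext.
missing = check_spark_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```

If both variables are set but the error persists, make sure the pyspark version reported by `pip show pyspark` matches the Spark installation that SPARK_HOME points to, and restart the notebook kernel after changing the environment.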