I am using Spark on EMR and writing a PySpark script. I get an error when I try:
from pyspark import SparkContext
sc = SparkContext()
This is the error message:

Traceback (most recent call last):
  File "pyex.py", line 5, in <module>
    sc = SparkContext()
  File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "/usr/local/lib/python3.4/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM
I found an answer saying that I need to import SparkContext, but that did not work either.
Comments:
What result do you get if you run print(conf)? – pvy4917
sc = SparkContext(conf) – pvy4917
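The post does not state the cause, but one common source of this particular Py4JError is a version mismatch between the pip-installed pyspark package and the Spark distribution already on the EMR cluster: the Python side calls getEncryptionEnabled, and the error appears when the JVM side is an older Spark that lacks it. A minimal sketch of one workaround is to point the script at the cluster's own Spark install instead of the pip copy. The /usr/lib/spark path below is an assumption (it is where EMR typically installs Spark); verify it on your cluster.

```python
import glob
import os
import sys

# Assumption: on EMR, Spark is installed under /usr/lib/spark. If
# SPARK_HOME is already set in the environment, that value is kept.
spark_home = os.environ.setdefault("SPARK_HOME", "/usr/lib/spark")

# Put the cluster's own pyspark and py4j on sys.path, ahead of any
# pip-installed copy, so the Python and JVM sides match versions.
sys.path.insert(0, os.path.join(spark_home, "python"))

# The bundled py4j zip's name varies by version (e.g. py4j-*-src.zip),
# so glob for it rather than hard-coding a version.
for zip_path in glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")):
    sys.path.insert(0, zip_path)

# With the paths wired up, `from pyspark import SparkContext` should now
# load the cluster's pyspark. Alternatively, pin the pip package to the
# version that `spark-submit --version` reports.
```

One further note on the comment above: if you do pass a SparkConf, it must go by keyword, `SparkContext(conf=conf)`, because SparkContext's first positional parameter is `master`, not `conf`.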