I'm a new Spark user trying to run a Python script from the command line. I've tested pyspark in the interactive shell and it works fine, but when I try to create the SparkContext (sc) from my script, I get the following error:
  File "test.py", line 10, in <module>
    conf=(SparkConf().setMaster('local').setAppName('a').setSparkHome('/home/dirk/spark-1.4.1-bin-hadoop2.6/bin'))
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/context.py", line 229, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/java_gateway.py", line 48, in launch_gateway
    SPARK_HOME = os.environ["SPARK_HOME"]
  File "/usr/lib/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'SPARK_HOME'
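The traceback shows that `launch_gateway()` reads the `SPARK_HOME` environment variable directly from `os.environ`, so calling `setSparkHome()` on the `SparkConf` is not enough; the variable must exist in the process environment before PySpark starts the Java gateway. Also note that in the call above `setSparkHome` points at the `bin/` subdirectory, whereas `SPARK_HOME` conventionally refers to the installation root. A minimal sketch of a workaround, assuming the path from the traceback is the actual install location:

```python
import os

# Set SPARK_HOME in the environment before any pyspark import that
# triggers gateway launch. The path below is taken from the traceback;
# adjust it to your own installation root (not its bin/ subdirectory).
os.environ["SPARK_HOME"] = "/home/dirk/spark-1.4.1-bin-hadoop2.6"

# launch_gateway()'s os.environ["SPARK_HOME"] lookup now succeeds
# instead of raising KeyError.
print(os.environ["SPARK_HOME"])
```

Alternatively, export the variable in the shell (e.g. `export SPARK_HOME=/home/dirk/spark-1.4.1-bin-hadoop2.6` in `~/.bashrc`) so it is inherited by every script you run.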