After a fresh installation of pyspark via pip install pyspark, I get the following errors:
> pyspark
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/bin']
/usr/local/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
/usr/local/bin/pyspark: line 77: /bin/spark-submit: No such file or directory
/usr/local/bin/pyspark: line 77: exec: /bin/spark-submit: cannot execute: No such file or directory
> spark-shell
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/bin']
/usr/local/bin/spark-shell: line 57: /bin/spark-submit: No such file or directory
What is a valid SPARK_HOME, how do I set it, and why is there no default available? I've seen instructions for setting the environment variable manually after a manual Spark installation, but I'd like to know how to set it after installing pyspark with pip. I had already installed Spark itself (via brew install apache-spark), and the spark-shell from that installation worked out of the box. It's only after installing pyspark that I get the messages above. Quite confused.