Cannot submit job to Spark cluster (cluster mode)


Spark version 1.3.0

I get an error when submitting a job to the Spark cluster in cluster mode:

 ./spark-submit --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount --deploy-mode cluster wordcount-0.1.jar 172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log

Output:

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Running Spark using the REST application submission protocol.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/14 16:41:10 INFO StandaloneRestClient: Submitting a request to launch an application in spark://172.20.9.151:7077.
Warning: Master endpoint spark://172.20.9.151:7077 was not a REST server. Falling back to legacy submission gateway instead.
15/04/14 16:41:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Sending launch command to spark://172.20.9.151:7077
Error connecting to master spark://172.20.9.151:7077 (akka.tcp://sparkMaster@172.20.9.151:7077), exiting.
1 Answer

By default, the master's Spark REST URL is on port 6066. So you should use spark://172.20.9.151:6066 as the master endpoint.

If you open the Spark web console (http://master:8080), you will see the details of the cluster's various endpoints.
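Putting the answer together with the question's command, the submission could be retried against the REST endpoint on port 6066. This is a sketch, not the asker's confirmed fix: it assumes the master host 172.20.9.151 seen in the logs (the original command passed no --master flag, so the master was presumably configured in spark-defaults.conf), and reuses the jar and Kafka broker arguments from the question.

```shell
# Sketch: submit via the REST gateway on port 6066 instead of the
# legacy 7077 endpoint. Host, jar, and broker list are taken from the
# question above; adjust to your own cluster.
./spark-submit \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  --master spark://172.20.9.151:6066 \
  --deploy-mode cluster \
  wordcount-0.1.jar \
  172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log
```

With the REST endpoint, the "was not a REST server. Falling back to legacy submission gateway" warning from the log should no longer appear.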


Why is the REST URL needed? Shouldn't we provide the Spark master URL, the one that usually runs on port 7077? - shanti
