I am trying to do some simple Spark SQL programming in Java. In the program, I fetch data from a Cassandra table, convert the RDD into a Dataset, and display the data. When I run the spark-submit command, I get the following error:

java.lang.ClassNotFoundException: org.apache.spark.internal.Logging

My program is as follows:
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.*;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;

SparkConf sparkConf = new SparkConf().setAppName("DataFrameTest")
        .set("spark.cassandra.connection.host", "abc")
        .set("spark.cassandra.auth.username", "def")
        .set("spark.cassandra.auth.password", "ghi");
SparkContext sparkContext = new SparkContext(sparkConf);

// Read the "test"."log" Cassandra table into an RDD of Log beans
JavaRDD<Log> logsRDD = javaFunctions(sparkContext).cassandraTable("test", "log",
        mapRowTo(Log.class));

SparkSession sparkSession = SparkSession.builder().appName("Java Spark SQL").getOrCreate();
Dataset<Row> logsDF = sparkSession.createDataFrame(logsRDD, Log.class);
logsDF.show();
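For context, mapRowTo expects Log to be a serializable JavaBean whose property names match the Cassandra column names, and createDataFrame then reads the same bean back through its getters. The actual Log class is not shown above; the following is a minimal sketch, and the id/message columns are assumptions:

import java.io.Serializable;

// Hypothetical bean for the "test"."log" table; the real column
// names and types must match the Cassandra schema.
public class Log implements Serializable {
    private String id;       // assumed column "id"
    private String message;  // assumed column "message"

    public Log() { }  // no-arg constructor required by mapRowTo

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
}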
My POM dependencies are as follows:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>1.6.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.2</version>
    </dependency>
</dependencies>
My spark-submit command is:

/home/ubuntu/spark-2.0.2-bin-hadoop2.7/bin/spark-submit --class "com.jtv.spark.dataframes.App" --master local[4] spark.dataframes-0.1-jar-with-dependencies.jar
How can I resolve this error? Downgrading to 1.5.2 does not work, because 1.5.2 has neither org.apache.spark.sql.Dataset nor org.apache.spark.sql.SparkSession.
<version>1.6.3</version> ;) Spark uses Guava and some other libraries, so there may be version conflicts. - T. Gawęda