SparkContext class not found error


I am trying to run Scala code on Spark from IntelliJ.

Scala code:

import scala.collection.JavaConverters._
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object WordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("anything").setMaster("localhost"))
    println("Hello World!")
  } 
}

POM (Project Object Model):
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>org.vocp</groupId>
<artifactId>SparkScalaConnect</artifactId>
<version>1.0-SNAPSHOT</version>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<pluginRepositories>
    <pluginRepository>
        <id>scala-tools.org</id>
        <name>Scala-tools Maven2 Repository</name>
        <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
</pluginRepositories>

<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.2.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.5</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>2.7.5</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
</project>

Error message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
at WordCount$.main(WordCount.scala:9)
at WordCount.main(WordCount.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

I tried running the same code in Scala IDE and it works fine. I suspect the problem lies in the IntelliJ setup.

I have been stuck on this for quite a while. Any suggestions would be helpful.

Thanks in advance.

Note - I am running the code with Oracle Java 1.7 and Scala SDK 2.10.5.


In the project module settings, is Spark Core included in the list of libraries? - FaigB
spark-core_2.10 is present under the project's External Libraries. Is that what you mean? - wadhwasahil
https://github.com/hammadhaleem/SparkDemo/blob/master/DataReader.scala#L17 - change local => local[8] and the problem is solved. - Hammad Haleem
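A side note on that last comment: the master URL in the question's code ("localhost") is itself invalid; Spark expects "local", "local[N]", "local[*]", or a cluster URL such as "spark://host:7077". A minimal sketch of a working local setup (the app name and core count are placeholders, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process using all available cores;
    // a bare "localhost" is not a valid master URL and fails to parse.
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println("Hello World!")
    sc.stop() // release the local Spark resources
  }
}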
1 Answer

Try changing the scope of the Spark dependency in the question's POM from "provided" to "compile".
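Background on why this works: with "provided" scope, Maven assumes the jar will be supplied by the runtime environment (for example by spark-submit), so it is left off the classpath when the class is launched directly from IntelliJ, which is exactly what produces the NoClassDefFoundError above. A minimal sketch of the changed dependency block, reusing the coordinates from the question's POM:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.2.0</version>
    <!-- "compile" (the default scope) puts the jar on both the
         compile-time and runtime classpaths -->
    <scope>compile</scope>
</dependency>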

I changed the scope of all the dependencies and everything works now. - wadhwasahil
