SBT cannot import the Kafka decoder classes.


Project setup:

  • 1 producer - serializes objects and sends the bytes to Kafka
  • 1 Spark consumer - should use the DefaultDecoder from the kafka.serializer package to consume the bytes

Problem:

  • SBT imports the libraries correctly (kafka-clients + kafka_2.10), but cannot find any class inside the kafka_2.10 jar.
  • It seems to be searching under the wrong path (org.apache.spark.streaming.kafka instead of org.apache.kafka).

Error message:

    [error] object serializer is not a member of package org.apache.spark.streaming.kafka
    [error] import kafka.serializer.DefaultDecoder

sbt dependency tree:

    [info]   +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
    [info]   | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- **DefaultDecoder is in here but SBT can't find it (kafka.serializer.DefaultDecoder)**
    [info]   | | +-org.apache.kafka:kafka-clients:0.8.2.1

build.sbt:

    lazy val commonSettings = Seq(
      organization := "org.RssReaderDemo",
      version := "0.1.0",
      scalaVersion := "2.10.6"
    )

    resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"

    val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
    val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
    val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"

    // Needed to be able to parse the generated avro JSON schema
    val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"

    val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
    val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"

    val avro = "org.apache.avro" % "avro" % "1.8.0"

    lazy val root = (project in file(".")).
      settings(commonSettings: _*).
      settings(
        libraryDependencies += spark,
        libraryDependencies += sparkStreaming,
        libraryDependencies += sparkStreamKafka,
        libraryDependencies += jacksonMapperAsl,
        libraryDependencies += scalactic,
        libraryDependencies += scalatest,
        libraryDependencies += avro
      )

The line that triggers the error in SBT: import kafka.serializer.DefaultDecoder - mds91

2 Answers


This has nothing to do with SBT. You probably have code similar to:

import org.apache.spark.streaming._
import kafka.serializer.DefaultDecoder

Because the package org.apache.spark.streaming.kafka exists, and the wildcard import brings it into scope as kafka, the second import resolves to org.apache.spark.streaming.kafka.serializer.DefaultDecoder. You can import the correct class with an absolute path: import _root_.kafka.serializer.DefaultDecoder. For more details on Scala imports, see https://wiki.scala-lang.org/display/SYGN/Language+FAQs#LanguageFAQs-HowdoIimport
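A minimal, self-contained sketch of the clash, using short hypothetical stand-in packages instead of the real Spark and Kafka ones. A wildcard import binds names with higher precedence than top-level packages, so the nested kafka shadows the root-level one:

```scala
// Hypothetical packages that mimic the collision:
// a top-level `kafka` and a nested `spark.kafka`.
package kafka { package serializer { class DefaultDecoder } }
package spark { package kafka { package serializer { class WrongDecoder } } }

object Demo {
  import spark._  // now `kafka` refers to spark.kafka, shadowing the top-level package

  // import kafka.serializer.DefaultDecoder
  //   ^ would fail: resolves to spark.kafka.serializer, which has no DefaultDecoder

  // `_root_` forces resolution from the root package, bypassing the shadowing:
  import _root_.kafka.serializer.DefaultDecoder

  val d = new DefaultDecoder
}
```

The same `_root_.kafka.serializer.DefaultDecoder` form works unchanged against the real Spark/Kafka packages.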


Thanks - that looks like exactly it. - mds91
Thank you. With your suggestion my code now compiles. - Suresh
Thanks - this pointed me in the right direction. I fixed it by being more explicit in my imports: import org.apache.spark.streaming.{Seconds, StreamingContext} import kafka.serializer.StringDecoder That is my stylistic preference, but either way works. - reverend
Thanks, this post was helpful to me. - Ishan Kumar


You need to put "import kafka.serializer.StringDecoder" before "import org.apache.spark.streaming._". Reordering the imports resolves the problem.

The correct import order is:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming._

This order throws the exception -

import org.apache.spark.streaming._
import kafka.serializer.StringDecoder
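Why the order matters: Scala imports are processed top to bottom, and each import resolves names using only what is in scope above it. A sketch with hypothetical stand-in packages:

```scala
// Hypothetical packages mimicking the collision: a top-level `kafka`
// and a nested `streaming.kafka`.
package kafka { package serializer { class StringDecoder } }
package streaming { package kafka { object Placeholder } }

object OrderDemo {
  // Works: at this point `kafka` can only mean the top-level package.
  import kafka.serializer.StringDecoder

  // This wildcard rebinds `kafka` to streaming.kafka for everything
  // below it, but the explicit import above has already resolved
  // (and explicit imports take precedence over wildcard ones anyway).
  import streaming._

  val d = new StringDecoder
}
```

Swap the two imports and the StringDecoder import fails, because `kafka` then resolves to the freshly imported streaming.kafka.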
