Project setup:
- 1 producer - serializes objects and sends the bytes to Kafka
- 1 Spark consumer - should use the DefaultDecoder from the kafka.serializer package to consume the bytes
The problem:
- SBT imports the libraries correctly (kafka-clients + kafka_2.10), but none of the classes in the kafka_2.10 jar can be found.
- It seems the compiler is searching under the wrong path (org.apache.spark.streaming.kafka instead of the top-level kafka package).
Error message:
[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
sbt dependency tree:
[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- **DefaultDecoder is in here (kafka.serializer.DefaultDecoder), but the compiler can't find it**
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
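The error looks consistent with Scala's relative import resolution: when an enclosing package (or a wildcard import) brings a member named `kafka` into scope, the bare name `kafka` in an import no longer refers to the top-level `kafka` package from the kafka_2.10 jar. A minimal, dependency-free sketch of this effect, using hypothetical package names (`org.spark.streaming` etc.) that stand in for the real Spark/Kafka ones:

```scala
// Hypothetical packages standing in for org.apache.spark.streaming.kafka
// and the top-level `kafka` package from the kafka_2.10 jar.
package org { package spark { package streaming { package kafka {
  object Src { val from = "spark streaming kafka" }
}}}}

package kafka {
  object Src { val from = "top-level kafka" }
}

package org.spark.streaming {
  object Consumer {
    // The bare name `kafka` resolves to the nearest enclosing match,
    // org.spark.streaming.kafka, shadowing the top-level package:
    val shadowed = kafka.Src.from
    // _root_ forces lookup to start from the root package:
    val forced = _root_.kafka.Src.from
  }
}
```

Under this reading, writing `import _root_.kafka.serializer.DefaultDecoder` in the consumer would sidestep the shadowing, though I haven't confirmed that this is the cause here.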
build.sbt:
lazy val commonSettings = Seq(
organization := "org.RssReaderDemo",
version := "0.1.0",
scalaVersion := "2.10.6"
)
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"
// Needed to be able to parse the generated avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro = "org.apache.avro" % "avro" % "1.8.0"
lazy val root = (project in file(".")).
settings(commonSettings: _*).
settings(
libraryDependencies += spark,
libraryDependencies += sparkStreaming,
libraryDependencies += sparkStreamKafka,
libraryDependencies += jacksonMapperAsl,
libraryDependencies += scalactic,
libraryDependencies += scalatest,
libraryDependencies += avro
)
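A side note on the build definition: since scalaVersion is pinned to 2.10.6, the Spark dependencies could equivalently be declared with `%%`, which appends the Scala binary-version suffix automatically (a sketch, equivalent to the explicit `_2.10` artifact names above):

```scala
// sbt's %% appends the Scala binary version (_2.10 here) taken from
// scalaVersion, so these resolve to the same *_2.10 artifacts as above.
val spark            = "org.apache.spark" %% "spark-core"            % "1.6.1"
val sparkStreaming   = "org.apache.spark" %% "spark-streaming"       % "1.6.1"
val sparkStreamKafka = "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
```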