Spark gives me a compile-time error:
Error:(49, 13) Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._ Support for serializing other types will be added in future releases.
.map(line => line.split(delimiter))
^
Here is the code:
val digital2 = sqlContext.read.text("path").as[String]
  .map(line => line.split(delimiter))
  .map(lineSplit => {
    new MyType(lineSplit(0), lineSplit(1), lineSplit(2), lineSplit(3),
      lineSplit(4).toInt, lineSplit(5).toInt, lineSplit(6).toInt, lineSplit(7).toInt)
  })
But this code works just fine:
val digital = sqlContext.read.text("path").as[String]
  .map(line => {
    val lineSplit = line.split(delimiter)
    new MyType(lineSplit(0), lineSplit(1), lineSplit(2), lineSplit(3),
      lineSplit(4).toInt, lineSplit(5).toInt, lineSplit(6).toInt, lineSplit(7).toInt)
  })
I don't understand what is going on here. Can someone explain?
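To make the difference concrete, here is how I read the intermediate types of the two pipelines (this is my understanding, not verified; `delimiter` and `MyType` are as defined elsewhere in my code):

```scala
// Failing version: two map calls, so an intermediate Dataset exists.
// sqlContext.read.text("path").as[String]   // Dataset[String]        -- built-in encoder
//   .map(line => line.split(delimiter))     // Dataset[Array[String]] -- encoder needed HERE
//   .map(lineSplit => new MyType(...))      // Dataset[MyType]

// Working version: the split happens inside a single map, so no
// Dataset[Array[String]] is ever created as a Dataset element type.
// sqlContext.read.text("path").as[String]   // Dataset[String]
//   .map { line =>
//     val lineSplit = line.split(delimiter) // plain Array[String], local value only
//     new MyType(...)                       // Dataset[MyType]
//   }
```

If that reading is right, the error at `.map(line => line.split(delimiter))` would be about finding an encoder for `Array[String]`, not for `MyType`, but I'm not sure why one compiles and the other doesn't.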