How to persist a Scala List to MongoDB using Spark?

I have a piece of Spark code that pulls some documents out of MongoDB, applies some transformations, and tries to store the results back into MongoDB. The problem arises when I try to persist a List object. First I generate some tuples with the following function:
val usersRDD = rdd.flatMap( breakoutFileById ).distinct().groupByKey().mapValues(_.toList)

Then I convert the tuple fields into a Document with a custom mapToDocument function and call saveToMongoDB:
usersRDD.map( mapToDocument ).saveToMongoDB()
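
For context, mapToDocument builds a Document whose fields include the grouped list, roughly along these lines (a simplified sketch; the field names and value types are illustrative, not the actual implementation):

def mapToDocument(entry: (String, List[String])): Document = {
  val (userId, files) = entry
  // Appending a Scala List directly is what triggers the codec error:
  // the Java driver has no codec for scala.collection.immutable.List.
  new Document("userId", userId)
    .append("files", files)
}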

I get the following error message:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class scala.collection.immutable.$colon$colon.
    at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46)
    at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63)
    at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51)
    at org.bson.codecs.DocumentCodec.writeValue(DocumentCodec.java:174)
    at org.bson.codecs.DocumentCodec.writeMap(DocumentCodec.java:189)
    at org.bson.codecs.DocumentCodec.encode(DocumentCodec.java:131)
    at org.bson.codecs.DocumentCodec.encode(DocumentCodec.java:45)
    at org.bson.codecs.BsonDocumentWrapperCodec.encode(BsonDocumentWrapperCodec.java:63)
    at org.bson.codecs.BsonDocumentWrapperCodec.encode(BsonDocumentWrapperCodec.java:29)
    at com.mongodb.connection.InsertCommandMessage.writeTheWrites(InsertCommandMessage.java:101)
    at com.mongodb.connection.InsertCommandMessage.writeTheWrites(InsertCommandMessage.java:43)
    at com.mongodb.connection.BaseWriteCommandMessage.encodeMessageBodyWithMetadata(BaseWriteCommandMessage.java:129)
    at com.mongodb.connection.RequestMessage.encodeWithMetadata(RequestMessage.java:160)
    at com.mongodb.connection.WriteCommandProtocol.sendMessage(WriteCommandProtocol.java:212)
    at com.mongodb.connection.WriteCommandProtocol.execute(WriteCommandProtocol.java:101)
    at com.mongodb.connection.InsertCommandProtocol.execute(InsertCommandProtocol.java:67)
    at com.mongodb.connection.InsertCommandProtocol.execute(InsertCommandProtocol.java:37)
    at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:159)
    at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:286)
    at com.mongodb.connection.DefaultServerConnection.insertCommand(DefaultServerConnection.java:115)
    at com.mongodb.operation.MixedBulkWriteOperation$Run$2.executeWriteCommandProtocol(MixedBulkWriteOperation.java:455)
    at com.mongodb.operation.MixedBulkWriteOperation$Run$RunExecutor.execute(MixedBulkWriteOperation.java:646)
    at com.mongodb.operation.MixedBulkWriteOperation$Run.execute(MixedBulkWriteOperation.java:401)
    at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:179)
    at com.mongodb.operation.MixedBulkWriteOperation$1.call(MixedBulkWriteOperation.java:168)
    at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:230)
    at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:221)
    at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
    at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
    at com.mongodb.Mongo.execute(Mongo.java:781)
    at com.mongodb.Mongo$2.execute(Mongo.java:764)
    at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
    at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1$$anonfun$apply$1$$anonfun$apply$2.apply(MongoSpark.scala:132)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1$$anonfun$apply$1$$anonfun$apply$2.apply(MongoSpark.scala:132)
    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1$$anonfun$apply$1.apply(MongoSpark.scala:132)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1$$anonfun$apply$1.apply(MongoSpark.scala:131)
    at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:186)
    at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:184)
    at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
    at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
    at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:154)
    at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
    at com.mongodb.spark.MongoConnector.withCollectionDo(MongoConnector.scala:184)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1.apply(MongoSpark.scala:131)
    at com.mongodb.spark.MongoSpark$$anonfun$save$1.apply(MongoSpark.scala:130)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

If I remove the list in the mapToDocument function (i.e., don't include it as a document field), everything works fine. I have searched the web for similar problems but haven't found a suitable solution. Does anyone know how to solve this?
Thanks in advance.
1 Answer


From the unsupported types section of the documentation:

Some Scala types (e.g. List) are unsupported and should be converted into their Java equivalents. To convert from Scala into native Java types, include the following import statement and use the .asJava method:

import scala.collection.JavaConverters._
import org.bson.Document
import com.mongodb.spark.MongoSpark

// Convert the Scala List to a java.util.List with .asJava before
// putting it into the Document, so the Java driver can encode it.
val documents = sc.parallelize(
  Seq(new Document("fruits", List("apples", "oranges", "pears").asJava))
)
MongoSpark.save(documents)

The reason they are unsupported is that in this case the Mongo Spark Connector uses the Mongo Java Driver under the hood, so there is no dependency on the Scala async driver. However, this means that for RDDs you have to map Scala collections to supported Java types yourself. When using Datasets, these conversions are done automatically.
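
Applied to the code in the question, a minimal sketch of the fix (assuming the grouped values are strings; the real field names aren't shown) is to call .asJava on the list inside mapToDocument:

import scala.collection.JavaConverters._
import org.bson.Document

def mapToDocument(entry: (String, List[String])): Document = {
  val (userId, files) = entry
  // Convert the Scala List to a java.util.List so the Java driver
  // can encode it with its built-in codecs.
  new Document("userId", userId)
    .append("files", files.asJava)
}

usersRDD.map(mapToDocument).saveToMongoDB()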

