I am working with GraphX and Pregel through the Java API. I am trying to implement a maximum-value algorithm (given a weighted graph, the output is the maximum weight), but my implementation does not work:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.graphx.Edge;
import org.apache.spark.graphx.EdgeDirection;
import org.apache.spark.graphx.Graph;
import org.apache.spark.graphx.GraphOps;
import org.apache.spark.rdd.RDD;
import org.apache.spark.storage.StorageLevel;
import scala.Tuple2;
import scala.reflect.ClassTag$;

import java.util.ArrayList;
import java.util.List;

public class Main {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("MaxValue").setMaster("spark://home:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> text_file = sc.textFile(args[0]);
        JavaRDD<String[]> text_file_arr = text_file.map(l -> l.split(" "));
        // cache the parsed lines, since they are reused to build both RDDs
        text_file_arr.cache();
        // create the vertex RDD: (vertexId, weight)
        RDD<Tuple2<Object, Integer>> verteces = text_file_arr.map(
                t -> new Tuple2<>((Object) Long.parseLong(t[0]), Integer.parseInt(t[t.length - 1]))
        ).rdd();
        // create the edge RDD: one edge per neighbour token on the line
        RDD<Edge<Boolean>> edges = text_file_arr
                .flatMap(l -> {
                    List<Edge<Boolean>> edgeList = new ArrayList<>();
                    long src = Long.parseLong(l[0]);
                    for (int i = 1; i < l.length - 1; ++i) {
                        edgeList.add(new Edge<>(src, Long.parseLong(l[i]), true));
                    }
                    return edgeList.iterator();
                })
                .rdd();
        // create the graph
        Graph<Integer, Boolean> graph = Graph.apply(
                verteces,
                edges,
                Integer.MIN_VALUE,
                StorageLevel.MEMORY_AND_DISK(),
                StorageLevel.MEMORY_AND_DISK(),
                ClassTag$.MODULE$.apply(Integer.class),
                ClassTag$.MODULE$.apply(Boolean.class)
        );
        graph.edges().toJavaRDD().collect().forEach(System.out::print);
        graph.vertices().toJavaRDD().collect().forEach(System.out::print);
        GraphOps<Integer, Boolean> graph_ops = new GraphOps<>(
                graph,
                ClassTag$.MODULE$.apply(Integer.class),
                ClassTag$.MODULE$.apply(Boolean.class)
        );
        // run pregel
        Graph<Integer, Boolean> graph_pregel = graph_ops.pregel(
                Integer.MIN_VALUE,
                3,
                EdgeDirection.Either(),
                new VProg(),
                new SendMsg(),
                new Merge(),
                ClassTag$.MODULE$.apply(Integer.class)
        );
        graph_pregel.vertices().toJavaRDD().saveAsTextFile("out");
    }
}
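For reference, the `map`/`flatMap` calls above assume that each input line has the form `srcId neighbour1 ... neighbourN weight`: the first token is the vertex id, the last token is its weight, and everything in between is a neighbour id. A minimal plain-Java sketch of that parsing, outside Spark (the line content is a made-up example):

```java
import java.util.ArrayList;
import java.util.List;

public class ParseLine {
    public static void main(String[] args) {
        // hypothetical input line: vertex 1, neighbours 2 and 3, weight 2
        String line = "1 2 3 2";
        String[] t = line.split(" ");

        // vertex tuple, as in the map() above: (id, weight)
        long vertexId = Long.parseLong(t[0]);
        int weight = Integer.parseInt(t[t.length - 1]);

        // edge list, as in the flatMap() above: one edge per neighbour token
        List<long[]> edges = new ArrayList<>();
        for (int i = 1; i < t.length - 1; ++i) {
            edges.add(new long[]{vertexId, Long.parseLong(t[i])});
        }

        System.out.println(vertexId + " " + weight + " " + edges.size());
    }
}
```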
These are the VProg, SendMsg and Merge classes:
class SendMsg extends AbstractFunction1<EdgeTriplet<Integer, Boolean>, Iterator<Tuple2<Object, Integer>>> implements Serializable {
    @Override
    public Iterator<Tuple2<Object, Integer>> apply(EdgeTriplet<Integer, Boolean> et) {
        System.out.println(et.srcId() + " ---> " + et.dstId() + " with: " + et.srcAttr() + " ---> " + et.dstAttr());
        if (et.srcAttr() > et.dstAttr()) {
            return JavaConverters.asScalaIteratorConverter(Arrays.asList(et.toTuple()._1()).iterator()).asScala();
        } else {
            return JavaConverters.asScalaIteratorConverter(new ArrayList<Tuple2<Object, Integer>>().iterator()).asScala();
        }
    }
}
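One detail worth double-checking in SendMsg: if I read the GraphX API correctly, `EdgeTriplet.toTuple()` returns `((srcId, srcAttr), (dstId, dstAttr), attr)`, so `toTuple()._1()` is the pair `(srcId, srcAttr)`, i.e. a message addressed to the *source* vertex, which already holds that value. For max-propagation the message would need to target `dstId` instead (e.g. `new Tuple2<>(et.dstId(), et.srcAttr())`). A plain-Java mock of one superstep (no Spark, made-up two-vertex graph) showing the difference between the two addressings:

```java
import java.util.HashMap;
import java.util.Map;

public class MsgTargetDemo {
    public static void main(String[] args) {
        // hypothetical graph: vertex 1 (value 8) --edge--> vertex 2 (value 4)
        Map<Long, Integer> values = new HashMap<>();
        values.put(1L, 8);
        values.put(2L, 4);

        // message built like toTuple()._1(): (srcId, srcAttr) -> delivered back to vertex 1
        long target = 1L;
        int msg = 8;
        values.merge(target, msg, Math::max);   // vertex program: keep the larger value
        System.out.println("toTuple._1 addressing: " + values.get(2L));

        // message addressed to the destination instead: (dstId, srcAttr)
        target = 2L;
        values.merge(target, msg, Math::max);
        System.out.println("dstId addressing: " + values.get(2L));
    }
}
```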
class VProg extends AbstractFunction3<Object, Integer, Integer, Integer> implements Serializable {
    @Override
    public Integer apply(Object l, Integer treeNodeThis, Integer treeNodeIn) {
        if (treeNodeThis > treeNodeIn) {
            System.out.println(l + " : " + treeNodeThis);
            return treeNodeThis;
        } else {
            System.out.println(l + " : " + treeNodeIn);
            return treeNodeIn;
        }
    }
}
class Merge extends AbstractFunction2<Integer, Integer, Integer> implements Serializable {
    @Override
    public Integer apply(Integer n1, Integer n2) {
        return (n1 > n2) ? n1 : n2;
    }
}
The problem is that after a vertex runs VProg, SendMsg is executed but the values are not updated. That is, VProg returns the new value, yet the graph still holds the input values. I have tried other algorithms as well and hit the same problem. Maybe I have written the VProg, SendMsg or Merge classes incorrectly?
The graph is connected, has 7 nodes, and each node's value is 2^nodenumber.
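For what it's worth, the behaviour I would expect from VProg/SendMsg/Merge can be simulated without Spark: in each superstep, every vertex whose value exceeds a neighbour's sends it across the edge, incoming messages are merged with max, and the receiver keeps the larger of its old value and the merged message. A sketch with 7 vertices valued 2^1 .. 2^7 on a path graph (both the numbering and the path topology are my assumptions):

```java
import java.util.HashMap;
import java.util.Map;

public class MaxValueSim {
    public static void main(String[] args) {
        // 7 vertices on a path 1-2-...-7, vertex i holding 2^i (assumed numbering)
        int n = 7;
        Map<Integer, Integer> value = new HashMap<>();
        for (int i = 1; i <= n; ++i) value.put(i, 1 << i);

        boolean changed = true;
        while (changed) {            // one loop iteration = one Pregel superstep
            changed = false;
            Map<Integer, Integer> inbox = new HashMap<>();
            for (int src = 1; src <= n; ++src) {
                for (int dst : new int[]{src - 1, src + 1}) {  // path neighbours
                    if (dst < 1 || dst > n) continue;
                    if (value.get(src) > value.get(dst)) {     // SendMsg condition
                        inbox.merge(dst, value.get(src), Math::max); // Merge
                    }
                }
            }
            for (Map.Entry<Integer, Integer> m : inbox.entrySet()) {
                int v = Math.max(value.get(m.getKey()), m.getValue()); // VProg
                if (v != value.get(m.getKey())) {
                    value.put(m.getKey(), v);
                    changed = true;
                }
            }
        }
        // on a connected graph, every vertex should converge to the global maximum
        System.out.println("vertex 1 ends with " + value.get(1));
    }
}
```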
I also tried using the Pregel class directly, but the same thing happened... I am using Spark 2.0.0 and Java 8.