How can I select multiple columns of a Dataset ds in Spark 2.3 Java by passing a list of column names?
For example, the following works fine:
ds.select("col1","col2","col3").show();
However, this fails:
List<String> columns = Arrays.asList("col1","col2","col3");
ds.select(columns.toString()).show();
Tested on Spark 2.4.0: you need to convert the List<String> into a Seq<String> and use selectExpr, as per the Spark documentation. If you want to use select instead, you must remove the first column from the list and pass it as the first argument, because the matching overload is select(String col, Seq<String> cols).
Given the following .csv file:
InvoiceNo,StockCode,Description,Quantity,InvoiceDate,UnitPrice,CustomerID,Country
536365,85123A,WHITE HANGING HEART T-LIGHT HOLDER,6,2010-12-01 08:26:00,2.55,17850.0,United Kingdom
536365,71053,WHITE METAL LANTERN,6,2010-12-01 08:26:00,3.39,17850.0,United Kingdom
536365,84406B,CREAM CUPID HEARTS COAT HANGER,8,2010-12-01 08:26:00,2.75,17850.0,United Kingdom
536365,84029G,KNITTED UNION FLAG HOT WATER BOTTLE,6,2010-12-01 08:26:00,3.39,17850.0,United Kingdom
536365,84029E,RED WOOLLY HOTTIE WHITE HEART.,6,2010-12-01 08:26:00,3.39,17850.0,United Kingdom
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import java.util.Arrays;
import java.util.List;
import scala.collection.JavaConverters;
import scala.collection.Seq;

public class SparkJavaTest {
    public static SparkSession spark = SparkSession
            .builder()
            .appName("JavaSparkTest")
            .master("local")
            .getOrCreate();

    // Convert a Java List to a Scala Seq, as required by the Seq-based overloads
    public static Seq<String> convertListToSeq(List<String> inputList) {
        return JavaConverters.asScalaIteratorConverter(inputList.iterator()).asScala().toSeq();
    }

    public static void main(String[] args) {
        Dataset<Row> ds = spark.read().option("header", true).csv("spark-file.csv");
        List<String> columns = Arrays.asList("InvoiceNo", "StockCode", "Description");

        // using selectExpr
        ds.selectExpr(convertListToSeq(columns)).show(false);

        // using select => the first column is passed separately, the rest as a Seq
        List<String> columns2 = Arrays.asList("StockCode", "Description");
        ds.select("InvoiceNo", convertListToSeq(columns2)).show(false);
    }
}
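Alternatively, since the varargs overloads are callable from Java, the list can be turned into a plain String[] and passed directly, skipping the Scala Seq conversion. A minimal sketch (the ds.selectExpr call is shown as a comment because it assumes a live Dataset):

```java
import java.util.Arrays;
import java.util.List;

public class SelectExprVarargs {
    public static void main(String[] args) {
        List<String> columns = Arrays.asList("InvoiceNo", "StockCode", "Description");
        // Turn the list into a String[] so it matches the varargs overload:
        String[] exprs = columns.toArray(new String[0]);
        // On a real Dataset this would be:
        // ds.selectExpr(exprs).show(false);
        System.out.println(String.join(",", exprs));
    }
}
```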
Either use
Dataset<Row> select(String col, scala.collection.Seq<String> cols)
as
String column = "col1";
Seq<String> columns = JavaConverters.asScalaBufferConverter(Arrays.asList("col2","col3")).asScala().toSeq();
ds.select(column, columns).show();
or use
Dataset<Row> select(Column... cols)
as
List<Column> columns = Arrays.asList(col("col1"), col("col2"), col("col3"));
ds.select(columns.toArray(new Column[0])).show();
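The List<Column> does not have to be spelled out by hand; the column names can be mapped to Column objects with a stream. A sketch of the pattern (the Spark-specific lines are comments since they assume a live ds and Spark on the classpath):

```java
import java.util.Arrays;
import java.util.List;

public class NamesToColumns {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("col1", "col2", "col3");
        // With Spark available, each name maps straight to a Column:
        // Column[] cols = names.stream().map(functions::col).toArray(Column[]::new);
        // ds.select(cols).show();
        // The same stream-to-array pattern, shown here with plain strings:
        String[] cols = names.stream().toArray(String[]::new);
        System.out.println(String.join("|", cols));
    }
}
```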
I tried List<Column> columns = Arrays.asList(col("col1"), col("col2")); but got a "method call expected" error. – ScalaBoy
Also note there is no Arrays.toList method; it should be Arrays.asList. – ScalaBoy