How to convert an array to columns in Spark?


I have a DataFrame that looks like this:

+-----------------------------+
|       Item                  |
+-----------------------------+
|[[a,b,c], [d,e,f], [g,h,i]]  |
+-----------------------------+

How can I convert it into the table below?
a b c
d e f
g h i

I tried using the explode and withColumn functions, but I got every combination of elements instead:

a b c
a e c
a h c
d b c
d e c
d h c
... (many other combinations)
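For context, exploding each array position independently behaves like a Cartesian product over the column values, which is why the combinations above appear. A plain-Python sketch of that effect (illustration only, not Spark API):

```python
from itertools import product

rows = [["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i"]]

# Each column position collects its values across rows:
# position 0 -> (a, d, g), position 1 -> (b, e, h), position 2 -> (c, f, i)
columns = list(zip(*rows))

# Exploding each position independently pairs every value of one column
# with every value of the others, i.e. a Cartesian product:
combos = list(product(*columns))
print(len(combos))  # 27 combinations instead of the 3 desired rows
```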
2 Answers


You need to explode only the first level of the array; you can then select the inner array elements as columns:

import pyspark.sql.functions as F

df = spark.createDataFrame(
    [([["a","b","c"], ["d","e","f"], ["g","h","i"]],)],
    ["Item"]
)

df.withColumn("Item", F.explode("Item")).select(
    *[F.col("Item")[i].alias(f"col_{i}") for i in range(3)]
).show()

#+-----+-----+-----+
#|col_0|col_1|col_2|
#+-----+-----+-----+
#|    a|    b|    c|
#|    d|    e|    f|
#|    g|    h|    i|
#+-----+-----+-----+
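The explode-then-index logic above can be mirrored in plain Python, which makes it easy to see what the select is doing (illustration only, not Spark code):

```python
item = [["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i"]]

# explode("Item"): the outer array becomes one row per inner array.
# Item[i].alias(f"col_{i}"): indexing an inner array picks column i.
table = [{f"col_{i}": row[i] for i in range(3)} for row in item]

for row in table:
    print(row)
# {'col_0': 'a', 'col_1': 'b', 'col_2': 'c'}
# {'col_0': 'd', 'col_1': 'e', 'col_2': 'f'}
# {'col_0': 'g', 'col_1': 'h', 'col_2': 'i'}
```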


@blackbishop, building on your answer to handle inner arrays of varying lengths...

import pyspark.sql.functions as F

df = spark.createDataFrame(
    [([["a","b","c"], ["d","e","f"], ["g","h","i", "j"]],)],
    ["data"]
)

df.show(20, False)

df = df.withColumn("data1", F.explode("data"))
df.select('data1').show()

# Find the longest inner array so we know how many columns to create
# Row(max(size(data1))=4) ---> 4
max_size = df.select(F.max(F.size('data1'))).collect()[0][0]


df.select(
    *[F.col("data1")[i].alias(f"col_{i}") for i in range(max_size)]
).show()



+------------------------------------+
|data                                |
+------------------------------------+
|[[a, b, c], [d, e, f], [g, h, i, j]]|
+------------------------------------+

+------------+
|       data1|
+------------+
|   [a, b, c]|
|   [d, e, f]|
|[g, h, i, j]|
+------------+

+-----+-----+-----+-----+
|col_0|col_1|col_2|col_3|
+-----+-----+-----+-----+
|    a|    b|    c| null|
|    d|    e|    f| null|
|    g|    h|    i|    j|
+-----+-----+-----+-----+
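The null padding in the last column follows from how Spark indexes arrays: `data1[i]` past the end of an array returns null rather than raising an error. A plain-Python equivalent of the dynamic-width logic (illustration only):

```python
data = [["a", "b", "c"], ["d", "e", "f"], ["g", "h", "i", "j"]]

# F.max(F.size('data1')): the widest inner array decides the column count
max_size = max(len(row) for row in data)  # 4

# data1[i] yields null past the end of an array; mimic that with None
table = [[row[i] if i < len(row) else None for i in range(max_size)]
         for row in data]

for row in table:
    print(row)
# ['a', 'b', 'c', None]
# ['d', 'e', 'f', None]
# ['g', 'h', 'i', 'j']
```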
