How can I get the information stored in the header (schema)?

Environment: Spark 2.4.5

Source: id-name.json

{"1": "a", "2": "b", "3": "c", ..., "n": "z"}

I load the .json file as a Spark DataFrame in JSON format, which is stored like this:

+---+---+---+---+---+
| 1 | 2 | 3 |...| n |
+---+---+---+---+---+
| a | b | c |...| z |
+---+---+---+---+---+

I want to produce a result like this:

+------------+------+
|     id     | name |
+------------+------+
| 1          | a    |
| 2          | b    |
| 3          | c    |
| .          | .    |
| .          | .    |
| .          | .    |
| n          | z    |
+------------+------+

My solution uses Spark SQL:

select stack(n, '1', `1`, '2', `2`, ..., 'n', `n`) as (id, name) from table_name;

This doesn't meet my needs, because I don't want to hardcode every "id" in the SQL. Maybe combining `show columns from table_name` with `stack()` would help? I'd appreciate any suggestions.
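The idea being reached for here can be sketched without Spark at all: once the header names are available (e.g. from `df.columns`), the stack expression can be assembled mechanically. A minimal plain-Scala sketch, with `stackExpr` as a hypothetical helper name:

```scala
object StackExprDemo {
  // Build a stack() SQL expression from a list of column names.
  // Each column contributes a ('name', `name`) pair; the first
  // argument is the number of rows stack() should produce.
  def stackExpr(cols: Seq[String]): String = {
    val pairs = cols.flatMap(c => Seq(s"'$c'", s"`$c`")).mkString(",")
    s"stack(${cols.length},$pairs) as (id,name)"
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical header, as it would come from df.columns
    println(stackExpr(Seq("1", "2", "3")))
    // → stack(3,'1',`1`,'2',`2`,'3',`3`) as (id,name)
  }
}
```

The resulting string can then be passed to `expr(...)` in `df.select` or interpolated into a `spark.sql` query, which is exactly what the answer below does.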

Answers

若相惜

Build the values required for `stack` dynamically and use them wherever needed. The code below generates the same values dynamically.

scala> val js = Seq("""{"1": "a", "2": "b","3":"c","4":"d","5":"e"}""").toDS
js: org.apache.spark.sql.Dataset[String] = [value: string]

scala> val df = spark.read.json(js)
df: org.apache.spark.sql.DataFrame = [1: string, 2: string ... 3 more fields]

scala> val stack = s"""stack(${df.columns.length},${df.columns.flatMap(c => Seq(s"'${c}'",s"`${c}`")).mkString(",")}) as (id,name)"""
stack: String = stack(5,'1',`1`,'2',`2`,'3',`3`,'4',`4`,'5',`5`) as (id,name)

scala> df.select(expr(stack)).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+


scala> df.createTempView("table")

scala> spark.sql(s"""select ${stack} from table """).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+



Updated code to fetch the data from a JSON file:

scala> import scala.sys.process._
import scala.sys.process._

scala> "hdfs dfs -cat /tmp/vn50ftc/sample.json".!
{"1": "a", "2": "b","3":"c","4":"d","5":"e"}
res4: Int = 0

scala> val df = spark.read.json("/tmp/vn50ftc/sample.json")
df: org.apache.spark.sql.DataFrame = [1: string, 2: string ... 3 more fields]

scala> val stack = s"""stack(${df.columns.length},${df.columns.flatMap(c => Seq(s"'${c}'",s"`${c}`")).mkString(",")}) as (id,name)"""
stack: String = stack(5,'1',`1`,'2',`2`,'3',`3`,'4',`4`,'5',`5`) as (id,name)

scala> df.select(expr(stack)).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+

scala> df.createTempView("table")

scala> spark.sql(s"""select ${stack} from table """).show(false)
+---+----+
|id |name|
+---+----+
|1  |a   |
|2  |b   |
|3  |c   |
|4  |d   |
|5  |e   |
+---+----+
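One detail worth double-checking: the first argument to `stack()` (the pair count) should come from `df.columns.length` rather than `df.columns.max`. `columns` is an `Array[String]`, so `max` compares lexicographically and only happens to give the right count for the five single-digit column names here. A plain-Scala demonstration of the pitfall:

```scala
object ColumnsMaxPitfall {
  def main(args: Array[String]): Unit = {
    // Ten column names as strings, as df.columns would return them
    val cols = (1 to 10).map(_.toString)

    // String comparison is lexicographic, so "9" > "10":
    println(cols.max)    // → 9  (wrong pair count for stack)
    println(cols.length) // → 10 (correct pair count)
  }
}
```

With `length`, the expression stays correct for any number of columns and any column names.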
