Multi-column split in Spark SQL

Using the Spark SQL window functions, I need to partition by several columns to run my data queries, as shown below:

val w = Window.partitionBy($"a").partitionBy($"b").rangeBetween(-100, 0)

I currently do not have a test environment (working on setting one up), but as a quick question: is this currently supported by the Spark SQL window functions, or will it not work?

1 answer

This will not work. The second `partitionBy` will overwrite the first. Both partition columns must be specified in a single call:

val w = Window.partitionBy($"a", $"b").rangeBetween(-100, 0)
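A minimal end-to-end sketch of the corrected window, assuming a local SparkSession and made-up columns `a`, `b`, `ts`, and `value` (not from the question). Note that a `rangeBetween` frame also requires an `orderBy` on the window, which the snippet in the question omits:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.sum

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("multi-column-window")
  .getOrCreate()
import spark.implicits._

// Hypothetical sample data: two partition columns plus an ordering column.
val df = Seq(
  ("x", "p", 1L, 10.0),
  ("x", "p", 50L, 20.0),
  ("x", "q", 60L, 30.0)
).toDF("a", "b", "ts", "value")

// Both partition columns go into ONE partitionBy call;
// rangeBetween needs a numeric ordering column to define the frame.
val w = Window.partitionBy($"a", $"b").orderBy($"ts").rangeBetween(-100, 0)

df.withColumn("running_sum", sum($"value").over(w)).show()
```

With this data, the row at `ts = 50` in partition `("x", "p")` should sum itself and the row at `ts = 1`, since both fall in the range `[-50, 50]`, while the `("x", "q")` row is kept in its own partition.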
