Params:
Result: Long
Value representing the current row. This can be used to specify the frame boundaries.
2.1.0
Source: https://spark.apache.org/docs/3.0.1/api/scala/org/apache/spark/sql/expressions/Window$.html
Timestamp: 2020-10-19T01:56:25.028Z
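A minimal sketch of anchoring a frame at the current row. It assumes the library's core namespace is aliased as g (the namespace is not named in these docs), that the value above is exposed as a zero-arg current-row function (the name is an assumption, mirroring Spark's Window.currentRow), and that :rows-between takes a [lower upper] pair; df, :category, and :price are hypothetical.

;; (require '[<core-namespace> :as g])  ; namespace name assumed
;; Frame covering the current row and the row immediately after it.
;; If current-row is a plain value rather than a function, drop the parentheses.
(def w
  (g/window {:partition-by :category
             :order-by     :price
             :rows-between [(g/current-row) 1]}))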
(over column window-spec)
Params: (window: WindowSpec)
Result: Column
Defines a windowing column.
1.4.0
Source: https://spark.apache.org/docs/3.0.1/api/scala/org/apache/spark/sql/Column.html
Timestamp: 2020-10-19T01:56:19.973Z
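A hedged sketch of over in use: an analytic column applied to a WindowSpec. The g alias, g/row-number, and g/with-column are assumptions mirroring Spark's row_number and withColumn; df is a hypothetical DataFrame.

;; (require '[<core-namespace> :as g])  ; namespace name assumed
(def by-price (g/window {:partition-by :category :order-by :price}))

;; Number the rows within each category, cheapest first.
(def ranked
  (g/with-column df :rank (g/over (g/row-number) by-price)))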
Params:
Result: Long
Value representing the last row in the partition, equivalent to "UNBOUNDED FOLLOWING" in SQL. This can be used to specify the frame boundaries.
2.1.0
Source: https://spark.apache.org/docs/3.0.1/api/scala/org/apache/spark/sql/expressions/Window$.html
Timestamp: 2020-10-19T01:56:25.054Z
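A hedged sketch of a frame that runs from the current row to the end of the partition, under the same assumptions as the sketches above (g alias, zero-arg boundary functions with assumed names, [lower upper] pairs for :rows-between, assumed g/sum and g/with-column, hypothetical df).

;; (require '[<core-namespace> :as g])  ; namespace name assumed
;; Remaining total: sum from the current row to the partition's last row.
(def remaining
  (g/window {:partition-by :category
             :order-by     :price
             :rows-between [(g/current-row) (g/unbounded-following)]}))

(def with-remaining-total
  (g/with-column df :remaining-total (g/over (g/sum :price) remaining)))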
Params:
Result: Long
Value representing the first row in the partition, equivalent to "UNBOUNDED PRECEDING" in SQL. This can be used to specify the frame boundaries.
2.1.0
Source: https://spark.apache.org/docs/3.0.1/api/scala/org/apache/spark/sql/expressions/Window$.html
Timestamp: 2020-10-19T01:56:25.055Z
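A hedged sketch of the classic running total, framed from the first row of the partition through the current row; the same assumptions as above apply, and unbounded-preceding is an assumed name mirroring Spark's Window.unboundedPreceding.

;; (require '[<core-namespace> :as g])  ; namespace name assumed
;; Running total: everything from the start of the partition through the current row.
(def running
  (g/window {:partition-by :category
             :order-by     :price
             :rows-between [(g/unbounded-preceding) (g/current-row)]}))

(def with-running-total
  (g/with-column df :running-total (g/over (g/sum :price) running)))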
(window {:keys [partition-by order-by range-between rows-between]})
Utility functions for defining windows in DataFrames.
Source: https://spark.apache.org/docs/3.0.1/api/scala/org/apache/spark/sql/expressions/Window$.html
Timestamp: 2020-10-19T01:55:47.755Z
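A hedged sketch of building a WindowSpec from the option map; all four keys are optional, and the [lower upper] pair passed to :range-between is an assumption about how the bounds are supplied.

;; (require '[<core-namespace> :as g])  ; namespace name assumed
;; Range frame of +/- 10 on the ordering column (:price), per :category partition.
(def price-band
  (g/window {:partition-by  :category
             :order-by      :price
             :range-between [-10 10]}))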
(windowed options)
Shortcut for creating a WindowSpec; takes a map as the argument.
Expected keys: [:partition-by :order-by :range-between :rows-between]
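A hedged sketch of windowed used inline where a WindowSpec is expected; g/max and g/with-column are assumptions mirroring Spark's max and withColumn, and df is a hypothetical DataFrame.

;; (require '[<core-namespace> :as g])  ; namespace name assumed
;; The same option map, passed inline as a shortcut.
(def with-category-max
  (g/with-column df :category-max
    (g/over (g/max :price) (g/windowed {:partition-by :category}))))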