
# metabase.mbql.normalize


Logic for taking any sort of weird MBQL query and normalizing it into a standardized, canonical form. You can think
of this as taking any 'valid' MBQL query and rewriting it as if it were written in perfect, up-to-date MBQL in the
latest version. Four main things are done here, as four separate steps:

#### NORMALIZING TOKENS

Converting all identifiers to lower-case, lisp-case keywords. e.g. `{"SOURCE_TABLE" 10}` becomes `{:source-table
10}`.
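
As a rough sketch, token normalization on its own can be exercised through `normalize-tokens` (documented below). The `mbql.normalize` alias is an arbitrary choice for this example, and `:ignore-path` follows the convention described in that var's docstring:

    (require '[metabase.mbql.normalize :as mbql.normalize])

    ;; map keys become lower-case, lisp-case keywords; plain values are untouched
    (mbql.normalize/normalize-tokens {"SOURCE_TABLE" 10} :ignore-path)
    ;; => {:source-table 10}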

#### CANONICALIZING THE QUERY

Rewriting deprecated MBQL 95/98 syntax and other things that are still supported for backwards-compatibility in
canonical MBQL 2000 syntax. For example `{:breakout [:count 10]}` becomes `{:breakout [[:count [:field-id 10]]]}`.
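
A hedged sketch of the same rewrite via `normalize-fragment` (documented below); the expected output simply mirrors the example above:

    (require '[metabase.mbql.normalize :as mbql.normalize])

    ;; deprecated breakout syntax is rewritten into canonical clause vectors
    (mbql.normalize/normalize-fragment [:query :breakout] ["count" 10])
    ;; => [[:count [:field-id 10]]]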

#### WHOLE-QUERY TRANSFORMATIONS

Transformations and cleanup of the query structure as a whole to fix inconsistencies. Whereas the canonicalization
phase operates at a lower level, transforming individual clauses, this phase focuses on transformations that affect
multiple clauses, such as removing duplicate references to Fields if they are specified in both the `:breakout` and
`:fields` clauses.
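
A sketch of what that might look like end to end; it assumes the duplicate reference is dropped from `:fields` (where it is redundant) rather than from `:breakout`:

    (require '[metabase.mbql.normalize :as mbql.normalize])

    (mbql.normalize/normalize
     {:type  :query
      :query {:source-table 1
              :breakout     [[:field-id 10]]
              :fields       [[:field-id 10] [:field-id 20]]}})
    ;; [:field-id 10] appears in both :breakout and :fields, so (under the
    ;; assumption above) it should be dropped from :fields:
    ;; => {:type  :query
    ;;     :query {:source-table 1
    ;;             :breakout     [[:field-id 10]]
    ;;             :fields       [[:field-id 20]]}}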

This is not the only place that does such transformations; several pieces of QP middleware perform similar
individual transformations, such as `reconcile-breakout-and-order-by-bucketing`.

#### REMOVING EMPTY CLAUSES

Removing empty clauses like `{:aggregation nil}` or `{:breakout []}`.
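
A minimal sketch combining both cases:

    (require '[metabase.mbql.normalize :as mbql.normalize])

    (mbql.normalize/normalize
     {:type  :query
      :query {:source-table 1
              :aggregation  nil
              :breakout     []}})
    ;; both empty clauses should be stripped:
    ;; => {:type :query, :query {:source-table 1}}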

Token normalization occurs first, followed by canonicalization, then whole-query transformations, and finally removal of empty clauses.

## is-clause?

`(is-clause? k-or-ks x)`


Is `x` an MBQL clause, and an instance of one of the clauses defined by keyword(s) `k-or-ks`?

    (is-clause? :count [:count 10])        ; -> true
    (is-clause? #{:+ :- :* :/} [:+ 10 20]) ; -> true

(This is different from the implementation in `mbql.u` because it also supports un-normalized clauses. You shouldn't
need to use this outside of this namespace.)

## normalize

`(normalize outer-query)`


Normalize the tokens in a Metabase query (i.e., make them all `lisp-case` keywords), rewrite deprecated clauses as
up-to-date MBQL 2000, and remove empty clauses.
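
A hedged end-to-end sketch that combines the examples from the namespace docstring above (the `mbql.normalize` alias is arbitrary):

    (require '[metabase.mbql.normalize :as mbql.normalize])

    (mbql.normalize/normalize
     {"type"  "query"
      "query" {"SOURCE_TABLE" 10
               "breakout"     ["count" 20]
               "filter"       nil}})
    ;; tokens are normalized, the breakout is canonicalized, and the empty
    ;; :filter clause is removed:
    ;; => {:type  :query
    ;;     :query {:source-table 10
    ;;             :breakout     [[:count [:field-id 20]]]}}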

## normalize-fragment

`(normalize-fragment path x)`


Normalize just a specific fragment of a query, such as just the inner MBQL part or just a filter clause. `path` is
where this fragment would normally live in a full query.

    (normalize-fragment [:query :filter] ["=" 100 200])
    ;;-> [:= [:field-id 100] 200]

## normalize-tokens

`(normalize-tokens x & [path])`


Recursively normalize tokens in `x`.

Every time this function recurses (through a map value) it adds a new (normalized) key to the key path, e.g. `path` will be
`[:query :order-by]` when we're in the MBQL order-by clause. If we need to handle these top-level clauses in special
ways, add a function to `path->special-token-normalization-fn` above.

In some cases, dealing with the path isn't desirable, and we don't want to trigger path-specific normalization
functions by accident (such as normalizing the `:type` key somewhere other than the top level of the query). By
convention, pass `:ignore-path` as the path in those cases.
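
A sketch of the `:ignore-path` convention on a stand-alone clause; the exact output shown is an assumption based on the token rules described at the top of this namespace (string values should be left untouched):

    (require '[metabase.mbql.normalize :as mbql.normalize])

    ;; normalize a lone filter clause without treating it as a full query
    (mbql.normalize/normalize-tokens ["=" ["FIELD-ID" 10] "ABC"] :ignore-path)
    ;; => [:= [:field-id 10] "ABC"]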

