(===> from)
(===> from to)
(==> from)
(==> from to)
(=> from)
(=> from to)
(compile-pattern pattern)
(compile-pattern pattern comp-env)
Compiles a pattern, returning a function that can be used in two different ways.
Arity 1 is the user-friendly version. Use it to match against a piece of data, returning either nil or a result map of name -> value. For example, this pattern will match an unordered multiply expression:
(let [find-unordered (compile-pattern '(* ?b (? a < ?b)))]
  (find-unordered '(* 1 2)) ;; => nil
  (find-unordered '(* 2 1))) ;; => {'b 2, 'a 1}
Experimental: Patterns may be altered and recompiled via a special call to the arity-1 matcher:
(let [p (compile-pattern '(+ 1 2 ?x))]
  (p ^::recompile (fn [orig-matcher compile pattern comp-env]
                    (compile (reverse pattern) comp-env)))
  (p '(9 2 1 +))) ;; => {x 9}
The recompile function takes 4 arguments and must have ::recompile in its metadata. This is to support progressive construction of rules. It does not facilitate rule reuse because the recompiled rules are mutated in place with the new matcher.
(def-derived name parent-dialect & decls)
Create a new dialect based on another one by defining terminals or forms that should be added or removed.
This works essentially the same as [[define-dialect]] but adds a parent-dialect argument and makes it possible to prevent inheritance of terminals, forms or expressions within forms from the parent dialect by prefixing each one with either + or -.
(def-derived D2 NS
  (terminals + (symbol s)
             - (nsname nsn)
             - (typename tn))
  - NsForm
  (Expr [e]
    + (let [(?:* ?s:binding* ?e:bound*)] ??e:body)))
In the above example, 1 new terminal is added, and 2 are removed, the entire NsForm form is removed, and a new Expr form is added, adding a let binding. It is also possible to remove a pattern expression from a form by replacing the + with a -. Forms that are newly added may also omit all of the + symbols as well, but within a form all expressions must either be marked with +/- or not marked at all.
(def-dialect name & decls)
Create a new dialect.
Both terminals and forms may be defined. The following example creates a language with 2 terminals (nsn and tn) and two forms (nsform and ns):
(def-dialect NS
  (terminals [nsn nsname?]
             [tn typename?])
  (NsForm [nsform]
    (:require (?:* (| ?nsn:req-symbol [?nsn:req-symbol ??opts])))
    (:import (?:* (| ?nsn:fq-name (?nsn:ns-name ??tn:typenames)))))
  (Namespace [ns :enforce]
    ((?:literal ns) ?nsn:name ??nsform))
  (entry Namespace))
The last form is the entry form for conformance unless a form is specifically designated by using (entry FormName).
By default, only terminals are predicated in matchers. If you want a form to be enforced, mark it with :enforce.
This is a somewhat sophisticated macro and as such has a bit of syntax you need to understand.
There are 2 top-level syntax types: terminals declarations and form declarations. Terminals are matched objects that do not require any further matching, ie. they are leaves of the syntax tree for the dialect. Forms are groups of patterns. In order for an IR instance to be valid, every form as it is recursively traversed must match one of the patterns and be tagged with the form type (ie. NsForm or Namespace in the above example).
(defpass name dialects compiled & fn-tail)
Define a pass with the given name and dialects. Use `(=> FromDialect ToDialect)` to specify the dialect pair.
This may be used either as a simple pass definition with no function body defined, in which case it will define `name` to be the value of the `compiled` argument, which may be a rule, [[let-rulefn]], or any other expression.
Alternately, if a fn-tail is defined then the `compiled` expression must contain `<>` exactly once in the location you want the function definition to be inserted. The point of this exercise is to allow the rule functions to be precompiled while still allowing a clear syntax for defining a pass.
Example usage:
(defpass naive-cps (=> LambdaCalc CPS)
  (let-rulefn [(M (=> Expr MExpr)
                  [(rule '(fn [?var] ?expr)
                     (let [k (gensym 'k)]
                       (sub (fn [?var ?k] ~(T expr k)))))
                   (rule '(? s symbol?) s)])
               (T* (=> Expr TExpr) [cont]
                   [(rule '(?:as expr (fn ??_))
                      `(~cont ~(M expr)))
                    (rule '(? s symbol?)
                      `(~cont ~(M s)))
                    (rule '(?f ?e)
                      (let [fs (gensym 'f)
                            es (gensym 'e)]
                        (T f (sub (fn [?fs]
                                    ~(T e (sub (fn [?es]
                                                 (?fs ?es ?cont)))))))))])
               (fn T [expr cont]
                 (first (T* expr {:cont cont})))]
    <>)
  [expr cont]
  (T expr cont))
(naive-cps '(g a) ''halt)
;; => ((fn [f48299] ((fn [e48300] (f48299 e48300 (quote halt))) a)) g)
In the above example, `M` is a simple rule, but `T*` uses its env to get the value of `cont`. The `[cont]` clause in the T* definition causes the rule handler to be wrapped in (let [cont (:cont %env)] ...).
At the end of the definition, the `[expr cont] (T expr cont)` expression gets wrapped with `(defn naive-cps ...)`, and the defn is placed at the point indicated by `<>`.
LambdaCalc and CPS are the dialects being transformed between. Expr is a form in LambdaCalc and MExpr and TExpr are forms in CPS.
(descend expression)
(descend expression env)
If passing in an env, pass it as the first arg since within a rule handler, the expression part is likely to be a large hairy expression, and the env aspect will be easily lost at the end of it.
(descend-all e*)
(descend-all e* env)
Descend each element in e*, threading the env and returning the result.
Like descend, if called without env it just returns the resulting expression and doesn't return the env, but if called with an env, it returns [result env].
An alternative strategy would be to merge the resulting envs, but that could require a custom merge strategy, so isn't provided as a built-in helper.
(descend-into dialect)
(descend-into dialect forms)
(descend-into dialect forms descend-abbrs)
Creates a rule-list based on the valid expressions in the given dialect. The rules do not make any change to the expressions they match, but they enable correct descent through those expressions.
Each `form` in the dialect has a list of expressions. You can either specify a list of forms to include, or specify `:all`.
Descent through the forms is based on the `abbr` of the expression. If a form's abbreviation is included in the list of `descend-abbrs`, then for each included expression, the vars that have that abbr will be descended through. If no descend-abbr is provided, the abbr of each selected form will be used.
Note that terminals are never included in the list by default, but sometimes it may be useful to include them in the `descend-abbrs` list.
Example dialect:
(def-dialect D1
  (Exp [e] (if ?e:cond ?e:then ?e:else) (prg ?p ??e*))
  (Program [p] (program ?e)))
Example usages:
(descend-into D1)
;; => rule list with 3 rules, descending into e and p abbrs.
(descend-into D1 '[Program])
;; => rule list just matching (program ?e), but only descending into p
;; abbrs, so really does nothing.
(descend-into D1 '[Exp] '[p])
;; => rule list matching the 2 forms in Exp, but only descending into
;; p abbrs. Equivalent to:
(rule-list
(rule '(if ?e0 ?e1 ?e2)) ;; does nothing but prevents other rules from matching
(rule '(prg ?->p ??e*))) ;; descends into ?->p but otherwise makes no change
(dialects =>dialects & body)
Wrap a given rule combinator definition to specify that those rules transform between the given pair of dialects.
The rules will also make use of all abbr predicates defined within the rule (either terminals or expressions that are marked with :enforce).
(directed rule)
(directed opts raw-rule)
Recurs depth-first, but only into marked subexpressions.
Marking a subexpression looks like ?->x or ??->x (ie. marked with -> matcher mode), so a matcher like ?y would not get recurred into.
Does not iteratively descend into any expressions returned by matchers. To do any iterative descent, call [[descend]] within the handler on the subexpressions you wish to descend into.
You can also use opts to mark vars to descend by :name, :prefix or :abbr. Look at your rule metadata to see how the var names get that info extracted. For example, to descend all vars that have an abbr of `e`, use
{:descend {:abbr #{'e}}}
which would cause the same descent as the following rule even if that rule had no -> markings:
(rule '[?->e ?->e0 ?->e123 ?no (?-> e*) ?->e:ok ?e-no ?e0:no])
You can provide an optional :fn-map via the opts argument, which is a map from additional mode symbols to functions that are applied to a captured match before it is passed to the rule handler. Only one function per symbol is allowed.
If a function is provided as the opts argument, it is treated as if you had passed in {:fn-map {'>- f}}, and if subexpressions are marked with >-, the expression, or the result of traversing into the expression if it is also marked with ->, will be passed to the function f. If no function is provided, [[identity]] is used. In this case, the matcher would look like one of ?>-, ??>-, ?>-> (note this is a shortened form), ?>-->, ??->>-, etc. The order of >- and -> does not matter. If any symbols other than >- are provided in the :fn-map key of opts, the above description applies with the symbol you used.
You can provide a function on the :on-rule-meta opts key to make any arbitrary changes to rule metadata. The default is:
(fn on-rule-meta [rule-meta-before rule-meta-after]
rule-meta-after)
The rule argument is typically a rule-list of simple rules, but in theory any type of rule combinator should work, however determining the resulting behavior may be tricky in some cases...
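As a hedged illustration, here is a minimal directed rule list with hypothetical arithmetic rules (not part of the library). Only the subexpressions marked with -> are recursed into:

```clojure
;; Sketch: convert nested addition to postfix, descending only into ?-> vars.
(def to-postfix
  (directed
   (rule-list
    (rule '(+ ?->a ?->b) (sub (?a ?b +)))  ; descend into both operands
    (rule '(? n number?) n))))             ; numbers pass through unchanged

(to-postfix '(+ 1 (+ 2 3)))
;; => (1 (2 3 +) +)
```

Because descent is explicit, an unmarked matcher like ?b here would leave its nested (+ ...) form untransformed.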
(eval-spliced x)
Experimental. Uses [[spliced]] to transform regular lists, then uses eval to resolve spliced data. Doesn't resolve any data in the local scope.
(from-dialect dialect & body)
Wrap a given rule combinator with the dialect. See [[dialects]].
(in x env)
Descend with an env without retaining the resulting env.
(in-order opts & rules)
Runs each of the rules in the list in a chain. If any rule succeeds, the subsequent rules are run with the new value. If a rule fails, the current value does not change and the next rule is run.
Each rule can itself be any rule-combinator.
opts:
:equiv? default: [[equiv?]]
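A minimal sketch with hypothetical rules (not from the library) showing how the chain threads values, skipping rules that fail:

```clojure
(def steps
  (in-order {}
    (rule '(+ ?a ?b) (sub (sum ?a ?b)))   ; 1st: rename + to sum
    (rule '(* ?a ?b) (sub (prod ?a ?b)))  ; 2nd: fails here, value unchanged
    (rule '(sum 1 ?b) (sub (inc ?b)))))   ; 3rd: sees the 1st rule's output

(steps '(+ 1 2))
;; => (inc 2)
```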
(iterated the-rule)
(iterated equiv? the-rule)
Run the given rule combinator repeatedly until running the rule makes no further changes.
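For instance, with a hypothetical additive-identity rule, the combinator reapplies it at the top level until a fixpoint is reached:

```clojure
(def drop-zeros
  (iterated (rule '(+ 0 ?x) x)))

(drop-zeros '(+ 0 (+ 0 99)))
;; => 99   ; (+ 0 (+ 0 99)) -> (+ 0 99) -> 99, then no further change
```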
(let-rulefn rulefns & body)
This does a simple transformation to allow all of the rules in rule-fns to refer to each other. Each rule list is made [[directed]], and all of the rule lists are assembled into an [[on-mutual]] set of rules via [[combine-rules]]. The combined ruleset is bound to `%pass`.
If no body expressions are provided, this will return `%pass`, otherwise it will return the value of the last expression in the body.
When used inside [[defpass]], this call is rewritten to use [[let-rulefn*]] with the dialect pair specified in the pass. If you don't want this behaviour, use [[let-rulefn*]] directly with nil for the dialects argument.
(listy? x)
Returns true if x is any kind of list except a vector.
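A quick sketch of the distinction (assuming the behavior described above):

```clojure
(listy? '(1 2 3))        ;; => true   ; lists count
(listy? (map inc [1 2])) ;; => true   ; other seq types count too
(listy? [1 2 3])         ;; => false  ; vectors do not
```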
(mark-success rule value _ env _)
Capture in the env that the rule succeeded.
(match? pattern)
(match? pattern datum)
Like [[matcher]] but simply returns true if matched successfully.
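For example, using the arity-2 form to check a datum directly:

```clojure
(match? '(* ?a ?b) '(* 1 2))  ;; => true
(match? '(* ?a ?b) '(+ 1 2))  ; no match, so not true
```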
(matcher pattern)
(matcher pattern datum)
Compiles (and optionally executes) a matcher pattern.
The result is either nil if no match is made or a list of matches for each variable in the pattern in the order they are defined.
(let [find-unordered (matcher '(* ?b (? a < ?b)))]
  (find-unordered '(* 1 2)) ;; => nil
  (find-unordered '(* 2 1))) ;; => '(2 1)
This style is useful for short or simple patterns but it becomes more challenging to maintain matcher ordering between the pattern and the result as the pattern complexity increases. To instead receive a dictionary of matches, use [[compile-pattern]] instead, which returns a function that, when called with just 1 argument, returns either a dictionary of matches or nil.
The compilation and execution process for this function and [[compile-pattern]] is identical.
(merge-metadata & forms)
Attach a post processor that will merge the original value's metadata into the new value's metadata.
If a merge strategy is attached to the new value as :rule/merge-meta, use that fn to do the merge. The :rule/merge-meta key will be removed from the resulting metadata.
(name-rule name rule)
Attach a rule name to the given object's metadata.
(ok? x)
(on-mutual initial-form name-rule-pairs)
(on-mutual equiv? initial-form name-rule-pairs)
The idea is that you can create a group of named rule sets where matchers are tagged with metadata and a matcher mode that tells this system to switch which rule set is applied for subexpressions of the given type. Effectively this lets you switch between expression types (or dialects?) when applying rules to an expression.
This is currently done in a somewhat simplistic way with bound variables because I'm not exactly sure how it should be structured but eventually it should be done without the need for extra global state like this.
(on-subexpressions the-rule)
(on-subexpressions equiv? the-rule)
Run the given rule combinator on all subexpressions depth-first.
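A small sketch with a hypothetical additive-identity rule: because traversal is depth-first over every subexpression, both nested forms are rewritten without any descent markers:

```clojure
(def strip-zero
  (on-subexpressions (rule '(+ ?x 0) x)))

(strip-zero '(* (+ a 0) (+ b 0)))
;; => (* a b)
```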
(pattern-names pattern)
Return a list of all of the variable names defined in the pattern in the order the values will be returned when using [[matcher]].
(let [find-unordered (matcher '(* ?b (? a < ?b)))]
  (pattern-names find-unordered)) ;; => (b a)
This may be passed either a pattern directly or a pattern compiled by either [[compile-pattern]] or [[matcher]].
(post-processors)
Get the currently active default post-processors
(prewalk-simplifier the-rule)
(prewalk-simplifier walk the-rule)
(prewalk-simplifier equiv? walk the-rule)
Run the given rule combinator repeatedly, then continue on a prewalk descent of all subexpressions until running the rule makes no further changes at each level.
This is the same strategy that Clojure's macroexpansion uses.
You can provide a [[walk]] argument to use a custom variant of clojure.walk/walk.
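A sketch of the macroexpansion-style behavior, using a hypothetical `twice` rewrite: the rule is run to fixpoint at the top, then the walk continues into the freshly produced subexpressions:

```clojure
(def expand-twice
  (prewalk-simplifier (rule '(twice ?x) (sub (+ ?x ?x)))))

(expand-twice '(twice (twice a)))
;; => (+ (+ a a) (+ a a))
```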
(quo expr)
Remove symbol namespaces.
Useful for cleaning up namespaces in syntax-quoted input data. Otherwise, use [[sub]].
Requires that the expression is syntax quoted. Does not perform any other transformation.
Usage:
(quo `(expt x ~(+ 1 1)))
(raw & forms)
Don't attach any additional post-processing to rules defined within this form. If post processors are attached within the raw form, they will remain.
(rebuild-rule rule
pattern
handler-body
handler-injection-names
handler-injection-data)
Update either the pattern or the handler body (or both) of the given rule.
Both the pattern and the handler-body must be quoted (unlike in [[rule]], where the handler-body is not quoted). This is to allow programmatic manipulation of the existing handler body, or otherwise generating it. The current version of both is present in the rule metadata.
When rebuilding a rule using eval, anything that may contain local state must be injected. In the handler function, refer to data that will be injected with normal symbols. Provide those symbols as a vector of injection-names. The corresponding data to be injected should be in the same order in injection-data.
(recombine rc rules)
(rmeta)
Expands to (meta (:rule/datom %env))
(rule pattern)
(rule pattern handler-body)
(rule name pattern handler-body)
Create a single rule. There are 2 arities, both with unique behavior.
Arity 1: [pattern] -> identity rule (see below)
Arity 2: [pattern body] -> simple replacement rule
Arity 3: [name pattern body] -> named simple replacement rule
If the `body` of arity 2 is nil/false, the rule fails the same as if it had not matched at all. If the matcher can backtrack and make another match, it may attempt the body/dict expression multiple times. Once the expression returns a valid replacement value or map, the rule will have matched, the replacement will be made, and no further backtracking will happen.
All pattern variables are bound with the match data in the handler body. For instance an arity 2 rule binding ?a and ?b that returns the sum of those matches:
(rule '(?a [?b]) (+ a b))
The same rule, named:
(rule add-a-to-b0 '(?a [?b]) (+ a b))
Rules may have unquote and spliced unquote in their definitions even if they are defined as normal quoted lists. The functionality is provided by a ruleset in pattern.r3.rewrite/spliced. It allows the following, but note that splices in rule definitions only happen at compile time:
(rule '[(? a ~my-pred) ~@my-seq-of-things]
{:matched a})
A rule with no handler will act as an identity rule, and will always match if the pattern matches. This may be useful within rule lists or for other higher level rule combinators that make use of the rule metadata in the match expression. For example:
(rule '?->expression)
The same rule, when named, must use the 3-arity form:
(rule expression '?->expression (success))
Side note: `(rule name '?->e)` seems nice, and I tried it, but sometimes one may want `(rule symbol :found)`. It's a recipe for weird breakage so I removed it.
Environment args:
A rule can bind arguments from its environment by attaching metadata to the input rule as follows:
(rule set-var ^{:env-args [var-name]} '?form (sub (set ?var-name ?form)))
Rules can also be called with succeed and fail callbacks
(my-rule data env succeed fail)
(rule-list & rules)
Try each of the rules in order top-down.
If any rule succeeds, return that result. If a rule matches but does not succeed, continue down the list.
Each rule can itself be any rule-combinator.
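For illustration, a minimal sketch of the top-down behavior (the rules here are hypothetical; `rule` and `rule-list` are assumed to be referred in from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule rule-list]])

;; Rules are tried in order; the first that matches and succeeds wins.
(def simplify-once
  (rule-list
   (rule '(+ 0 ?x) x)     ;; additive identity
   (rule '(* 1 ?x) x)))   ;; tried only if the rule above fails to match

;; (simplify-once '(* 1 q)) ;; the first rule fails, the second rewrites to q
```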
(rule-list! & rules)
Like rule-list, but throws an exception if no rule matches.
Each rule can itself be any rule-combinator.
(rule-name rule)
Get the name or pattern to identify the rule.
(rule-simplifier & rules)
Run a list of rule combinators repeatedly on all subexpressions until running them makes no further changes.
DEPRECATED: use [[simplifier]] instead. This one does not let you set [[equiv?]].
(rule-zipper rc)
Construct a zipper object for rule combinators to enable customization of rules, attaching custom metadata, etc.
(scanner the-rule)
(scanner
{:keys [linear iterate lazy rescan] :or {linear true iterate true} :as opts}
the-rule)
Convert any rule combinator to scan through a list or vector.
The :linear and :iterate options are true by default.
Linear scanner:
It does one full pass through the collection, but does not iterate at the top level.
For a linear scanner on a rule or rule-list, results from a matching rule are added to the final result and not rescanned. If you want the returned result included in the data and scanned again, set :rescan to true.
Not implemented yet: a scanner on an in-order combinator sets :rescan to true by default.
Rescanning scanner:
Specified with the option {:linear false}.
By default :iterate is set, so the rule will iterate at the top level and rerun through the collection after each successful rule. If :iterate is false, the scanner terminates on the first match.
Setting the :lazy option true presents the rules with the smallest possible matches first. A lazy rescanning scanner works from the back of the collection to the front.
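As a hedged sketch of the default linear behavior (the rule and data are hypothetical; `scanner` and `rule` are assumed to come from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule scanner]])

;; Scan a vector, rewriting each element the rule matches; one pass,
;; no top-level iteration, and replaced results are not rescanned.
(def bump-ints
  (scanner (rule '(? n int?) (inc n))))

;; (bump-ints [1 :a 2]) ;; intended: each int incremented once, :a untouched
```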
(show-dialect dialect & {:keys [full-names]})
Show the given dialect with all additions and removals of terminals, forms and expressions resolved. This is a useful tool for debugging, especially for dialects that go through many layers of derivation.
(show-parse dialect expr)
Show a detailed view of how the dialect parses a given input, even if it parses it successfully.
(simplifier the-rule)
(simplifier equiv? the-rule)
Run the given rule combinator repeatedly depth-first on all subexpressions until running the rule makes no further changes at each level.
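A sketch of the fixpoint behavior (hypothetical rules; names assumed to come from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule rule-list simplifier]])

(def simplify
  (simplifier
   (rule-list
    (rule '(+ 0 ?x) x)
    (rule '(* 1 ?x) x))))

;; Applied depth-first and repeated until no rule changes any subexpression,
;; so nested identities collapse in a single call:
;; (simplify '(* 1 (+ 0 (* 1 y)))) ;; intended: y
```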
(spliceable-pattern pattern)
(spliced form)
A function that allows regular quoted lists to be spliced just like syntax-quoted ones. It only really works within macros because the spliced-in data needs to be evaluated, and it doesn't seem possible to do that at runtime except with [[eval]], which does not use the current evaluation scope. If that works for you, use [[eval-spliced]], but usually you will be better off with either the [[sub]] (recommended) or [[quo]] macros.
This may eventually be useful together with SCI?
(sub form)
(sub f form)
Statically macroexpand substitution patterns expressed exactly like matcher patterns.
This produces what I expect should be optimally fast substitutions, but differs from [[pattern.substitute/substitute]] in that it requires that all substitution patterns be bound, and produces a compilation error if they are not.
The arity-2 version allows substitutions to be transformed by the supplied function before being inserted if they are marked with <- or wrapped with (?:<- ...).
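For instance, a rule body can use `sub` to build its replacement from the bound match names (a sketch assuming `rule` and `sub` from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule sub]])

;; Rewrite a single-binding let into an immediately invoked fn.
;; ?name, ?value, and ??body are bound by the match, and sub splices
;; them into the template; an unbound name here would fail to compile.
(def let->fn
  (rule '(let [?name ?value] ??body)
        (sub ((fn [?name] ??body) ?value))))

;; (let->fn '(let [a 1] (+ a 2))) ;; intended: ((fn [a] (+ a 2)) 1)
```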
(sub+ & etc)
Same as [[sub]]. Here for backward compatibility.
(subm form)
(subm form metadata)
Perform substitution and attach the provided metadata.
If called arity-1, copy the rule's original matching form's metadata onto the resulting form, using rmeta to capture the metadata.
(subm+ form)
(subm+ form metadata)
Like [[subm]], but checks that the form is an IObj before attaching metadata. Useful for generated expressions.
(substitute x)
(substitute x dict)
Substitute matchers in the given pattern with data in the provided dict.
If called with just a pattern (arity 1), returns a function that takes data and an optional failure continuation (fn [dict name pattern]) which must return a list of data to be spliced in place of the pattern.
If using a static pattern, prefer [[pattern.r3.rewrite/sub]].
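A small sketch of the arity-1 form described above (assuming `substitute` comes from the `pattern.substitute` namespace):

```clojure
(require '[pattern.substitute :refer [substitute]])

;; Arity 1 compiles the pattern and returns a function of the dict.
(def fill (substitute '(+ ?a ?b)))

;; (fill {'a 1 'b 2}) ;; intended: (+ 1 2)
```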
(success)
(success x)
(success x env)
Explicitly mark an object as successfully matched when returned from a rule.
The rule will unwrap the data automatically.
Allows rules to return user data directly without failing.
(success false) ;; Allows the rule to return false without failing.
The arity-0 version tells the matcher to use the original input data, also discarding any changes made by patterns that may have recursively matched with the rule.
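A sketch of why this matters: a rule body that returns nil normally signals failure, so wrap legitimately falsey results in success (hypothetical rule; names assumed from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule success]])

;; Without success, a nil lookup result would make the rule fail
;; and fall through to the next rule.
(def lookup-rule
  (rule '(lookup ?k)
        (success (get {:a 1 :b 2} k))))

;; (lookup-rule '(lookup :c)) ;; matches and succeeds with nil rather than failing
```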
(success:env env)
Succeed, but only change the env.
(to-dialect dialect & body)
Wrap a given rule combinator with the dialect. See [[dialects]].
(use-post-processor pp & forms)
Set all rules except identity rules in this scope to use the given post processor.
(use-post-processors pp ident-rule-pp & forms)
Set all rules, including identity rules, in this scope to use the given post processor and identity post processor.
(valid? dialect expr)
Returns true if the expr is valid in the given dialect
(validate dialect expr)
Validates an expression in the given dialect and either returns `ok` or a detailed parse showing all parse errors.
(with-env-args bindings rules)
Attach :env-args metadata to rules to enable convenient binding of env data in the rule handlers.
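A hedged sketch of the intent (the env key, rule, and output shape here are hypothetical; names are assumed to come from the library's main `pattern` namespace):

```clojure
(require '[pattern :refer [rule sub with-env-args]])

;; Binds the named env entries in each handler body, analogous to
;; attaching ^{:env-args [scope]} metadata to the rule by hand.
(def tag-with-scope
  (with-env-args [scope]
    (rule '?form (sub (scoped ?scope ?form)))))
```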
(with-predicates pred-map & forms)
For all rules defined within this block, matchers in rules with the given abbreviations or names will automatically have the given predicate attached to them.
(with-predicates {'i int?}
(rule '[?i ?i1 ??i*]
;; i, i1, and i* are guaranteed to be integers.
(apply + i i1 i*)))