
kuro.mojure


*debug* clj


*mode* clj


*tokenizer* clj


->exceptions clj

(->exceptions coll)

Accepts a clj-token and compounds the exceptions.


clj->en-token clj

(clj->en-token token)

Accepts a clj-token and translates its features to English.


clj->mojure-token clj

(clj->mojure-token token)

Accepts an org.atilika.kuromoji.Token and transforms its nature according to my opinionated perspective of what makes good Clojurian data.


clj-tokenize clj

(clj-tokenize s)

Segments text into an ordered seq of clj tokens. Must be used in the context of with-tokenizer.

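A minimal usage sketch, assuming kuro.mojure is required under the alias km (the alias and sample sentence are my own; the keys of each clj token are whatever moji->clj-token emits):

(require '[kuro.mojure :as km])

(km/with-tokenizer :normal
  (km/clj-tokenize "今日はいい天気です。"))
;; => ordered seq of clj token maps, one per segment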

jp->en clj

(jp->en word)

Converts a word in Japanese to English. Falls back to the Japanese word in case it's not mapped yet.


jp->en-mapping clj


moji->clj-token clj

(moji->clj-token token)

Accepts an org.atilika.kuromoji.Token and creates a Clojure map with the token attributes.

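A sketch that pairs it with raw-tokenize, since obtaining Tokens needs the tokenizer bound by with-tokenizer; the alias and sentence are assumptions, and the map keys are whatever attributes the function extracts:

(require '[kuro.mojure :as km])

(km/with-tokenizer :normal
  (map km/moji->clj-token (km/raw-tokenize "すもももももももものうち")))
;; => seq of plain Clojure maps, one per org.atilika.kuromoji.Token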

raw-tokenize clj

(raw-tokenize s)

Segments text into an ordered seq of org.atilika.kuromoji.Token tokens. Must be used in the context of with-tokenizer.

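A sketch that keeps the untouched Kuromoji objects; getSurfaceForm is an accessor on org.atilika.kuromoji.Token, while the alias and input are assumptions:

(require '[kuro.mojure :as km])

(km/with-tokenizer :normal
  (map #(.getSurfaceForm %) (km/raw-tokenize "形態素解析")))
;; => seq of surface strings, straight from Kuromoji, no Clojure conversion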

tokenize clj

(tokenize s)

Segments text into an ordered seq of kuromojure tokens. Must be used in the context of with-tokenizer.

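A sketch of the end-to-end call (alias and input are assumptions); the exact shape of a kuromojure token isn't spelled out here, so treat the result as library-defined:

(require '[kuro.mojure :as km])

(km/with-tokenizer :normal
  (km/tokenize "吾輩は猫である"))
;; => ordered seq of kuromojure tokens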

with-tokenizer clj macro

(with-tokenizer mode & body)

Builds a tokenizer with mode (:normal, :search, :extended) as input, providing a context in which the tokenizer can be used with fns that need it.

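A sketch showing one tokenizer context serving several calls; the mode keyword and input are assumptions, and any fn that needs the tokenizer can sit inside the body:

(require '[kuro.mojure :as km])

(km/with-tokenizer :search
  {:raw  (km/raw-tokenize "関西国際空港")
   :rich (km/tokenize "関西国際空港")})
;; both calls reuse the tokenizer bound for the extent of the body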
