(is-crawlable? {:keys [agent-rules]} url user-agent)
Does the given parsed robots.txt permit the given URL to be crawled by the given user-agent?
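For illustration, a minimal sketch of checking one URL against parsed rules. The robots.core namespace in the require and the "my-crawler" agent name are placeholder assumptions; substitute the library's actual namespace and your crawler's user-agent.

;; Placeholder namespace -- adjust to the library's actual namespace.
(require '[robots.core :as robots])

;; Parse a small robots.txt and ask whether a specific URL may be crawled.
(let [rules (robots/parse "User-agent: *\nDisallow: /private/")]
  (robots/is-crawlable? rules "https://example.com/private/page" "my-crawler"))
;; => expected false, since /private/ is disallowed for every user-agent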
(parse content)
Parses the given string (content of a robots.txt file) into data that can be queried.
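A sketch of typical usage: fetching a live robots.txt and parsing it so the result can be passed to is-crawlable?. Fetching via slurp is an assumption about how the content is obtained, not part of this library's API.

;; slurp is used here only for illustration; any string source works.
(def parsed-rules
  (robots/parse (slurp "https://example.com/robots.txt")))

;; parsed-rules can then be queried as shown under is-crawlable? above.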
cljdoc builds & hosts documentation for Clojure/Script libraries
Ctrl+k | Jump to recent docs |
← | Move to previous article |
→ | Move to next article |
Ctrl+/ | Jump to the search field |