(is-crawlable? {:keys [agent-rules]} url user-agent)
Does the given parsed robots.txt permit the given URL to be crawled by the given user-agent?
(parse content)
Parses the given string (content of a robots.txt file) into data that can be queried.
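The following is a minimal, self-contained sketch of how these two functions might fit together. The actual library's parsing rules and the shape of the data under `:agent-rules` are assumptions; this sketch only handles `User-agent` and `Disallow` directives, attributes each `Disallow` to the most recently named agent, and uses simple prefix matching on the URL path.

```clojure
(ns robots-sketch
  (:require [clojure.string :as str]))

(defn parse
  "Sketch: parses robots.txt content into {:agent-rules {agent [paths]}}.
   Only User-agent and Disallow lines are recognized; all else is ignored."
  [content]
  (-> (reduce
        (fn [{:keys [agent] :as acc} line]
          (let [line  (str/trim line)
                lower (str/lower-case line)]
            (cond
              ;; A User-agent line starts a new rule group.
              (str/starts-with? lower "user-agent:")
              (assoc acc :agent (str/lower-case (str/trim (subs line 11))))

              ;; A Disallow line adds a path prefix for the current agent.
              (and agent (str/starts-with? lower "disallow:"))
              (update-in acc [:agent-rules agent]
                         (fnil conj []) (str/trim (subs line 9)))

              :else acc)))
        {:agent nil :agent-rules {}}
        (str/split-lines content))
      (select-keys [:agent-rules])))

(defn is-crawlable?
  "Sketch: true unless a Disallow prefix for this user-agent (or *)
   matches the URL's path. An empty Disallow value allows everything."
  [{:keys [agent-rules]} url user-agent]
  (let [path  (or (second (str/split url #"//[^/]+" 2)) "/")
        rules (concat (get agent-rules (str/lower-case user-agent))
                      (get agent-rules "*"))]
    (not (some #(and (seq %) (str/starts-with? path %)) rules))))
```

A usage example under the same assumptions:

```clojure
(def robots
  (parse "User-agent: *\nDisallow: /private/\n\nUser-agent: evilbot\nDisallow: /\n"))

(is-crawlable? robots "https://example.com/index.html" "goodbot")   ;; => true
(is-crawlable? robots "https://example.com/private/data" "goodbot") ;; => false
(is-crawlable? robots "https://example.com/index.html" "evilbot")   ;; => false
```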