Basic interfaces and protocols, utility functions
Chunked collections offer chonky chunks of elements which can be reduced over. There should be enough chunks to offer meaningful parallelism, and they should be small enough that the system doesn't get bogged down forever working on a chunk--but they should also be big enough that we spend more time processing a chunk than doing bookkeeping and combining work across chunks.
(chunks this)
Returns a collection of chunks, each a reducible.
(chunked chunks)
(chunked n coll)
Takes a collection of chunks and creates a Chunked wrapper around them which, when asked for chunks, returns exactly that collection. Or, with two args, slices up a collection (e.g. a vector) into chunks of the given size, and makes a Chunked out of them.
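The two-arity slicing behavior can be sketched in plain Clojure with `partition-all`. This is an illustration of the semantics only, not the library's implementation: the real `chunked` additionally wraps the slices so that `chunks` returns exactly that collection.

```clojure
;; Slicing a 10-element vector into chunks of 4: the last chunk may be
;; smaller than the chunk size.
(def coll   (vec (range 10)))
(def slices (partition-all 4 coll))
;; slices => ((0 1 2 3) (4 5 6 7) (8 9))
```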
(chunked-vector-chunk-id count indices target)
Takes an array of starting indices for each chunk and an index in a chunked vector; returns the chunk ID for that index.
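The lookup semantics can be sketched in plain Clojure: the chunk ID for element index i is the position of the last starting index that is less than or equal to i. This linear version is purely illustrative; an implementation over a sorted array of starting indices would more likely use a binary search.

```clojure
;; Hypothetical sketch of the chunk-ID lookup. starting-indices holds the
;; index of the first element of each chunk; we count how many chunks
;; start at or before i, then step back one to get the chunk's ID.
(defn chunk-id-sketch [starting-indices i]
  (dec (count (take-while #(<= % i) starting-indices))))

(chunk-id-sketch [0 3 6] 4) ; => 1 (index 4 lives in the chunk starting at 3)
(chunk-id-sketch [0 3 6] 7) ; => 2
```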
How big is a chunk, by default?
(soft-chunked-vector count starting-indices load-nth)
(soft-chunked-vector name count starting-indices load-nth)
Makes a vector divided into several chunks, each stored in a soft reference, so the collection can be much larger than RAM. Takes an overall count for the vector's elements, and a list of starting indices--the first being 0, the second being the index of the first element in the second chunk, the third being the index of the first element in the third chunk, and so on. Takes a function load-nth which loads a chunk from disk. The resulting collection is also Chunkable, so it works with history.fold collections.
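As a self-contained sketch of the idea (not the library's implementation), one can model lazily materialized chunks with `delay` standing in for a SoftReference-backed cache. The name `soft-chunked-sketch`, its argument order, and its return shape are all hypothetical; a real soft reference could also be cleared under memory pressure and reloaded, which `delay` cannot.

```clojure
;; n elements, chunk starting indices, and a load-nth that materializes
;; chunk i (standing in for a read from disk). Each chunk is computed
;; on first access; element lookup finds the chunk, then offsets into it.
(defn soft-chunked-sketch [n starting-indices load-nth]
  (let [chunks (mapv (fn [i] (delay (load-nth i)))
                     (range (count starting-indices)))]
    (fn nth-element [i]
      (let [cid   (dec (count (take-while #(<= % i) starting-indices)))
            chunk @(nth chunks cid)]
        (nth chunk (- i (nth starting-indices cid)))))))

;; A 9-element "vector" in three chunks of three, computed on demand.
(def lookup
  (soft-chunked-sketch 9 [0 3 6]
    (fn [chunk-i] (vec (range (* 3 chunk-i) (* 3 (inc chunk-i)))))))

(lookup 4) ; => 4
```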
(soft-vector n load-nth)
(soft-vector name n load-nth)
Takes a number of elements n and a function `(load-nth i)` which takes an index in [0, n) and returns a value at that index. Returns a counted, indexed, seqable, sequential, chunkable collection whose elements are the values [(load-nth 0), (load-nth 1), ... (load-nth (dec n))]. Values are loaded on-demand and cached in soft references. Use this to represent chunked collections bigger than memory. The chunks are the vector itself. Takes an optional name for debugging.
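The element contract can be sketched as follows. `soft-vector-sketch` is a hypothetical stand-in, and `memoize` stands in for the soft-reference cache (unlike a soft reference, a memoized result is never dropped under memory pressure).

```clojure
;; Elements are (load-nth 0) ... (load-nth (dec n)), computed on demand
;; and cached; the lazy seq from map only calls load for realized elements.
(defn soft-vector-sketch [n load-nth]
  (let [load (memoize load-nth)]
    (map load (range n))))

(take 3 (soft-vector-sketch 5 (fn [i] (* i i)))) ; => (0 1 4)
```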