
franzy.clients.producer.client


make-producer

(make-producer config)
(make-producer config options)
(make-producer config key-serializer value-serializer)
(make-producer config key-serializer value-serializer options)

Inputs: ([config :- ps/ProducerConfig]
         [config :- ps/ProducerConfig options :- (s/maybe ps/ProducerOptions)]
         [config :- ps/ProducerConfig key-serializer :- Serializer value-serializer :- Serializer]
         [config :- ps/ProducerConfig key-serializer :- Serializer value-serializer :- Serializer options :- (s/maybe ps/ProducerOptions)])
Returns: FranzProducer

Create a Kafka Producer from a configuration, with optional serializers and optional producer options. If a callback is given, call it when stopping the producer. If serializers are provided, use them; otherwise expect serializers via class name in the config map.
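As a rough construction sketch: the serializer constructors (from franzy.serialization.serializers) and the config keys below are assumptions based on typical franzy usage, not definitions from this namespace; consult ps/ProducerConfig for the authoritative schema.

```clojure
(require '[franzy.clients.producer.client :as producer]
         '[franzy.serialization.serializers :as serializers]) ;; assumed serializer namespace

;; Illustrative config keys, mirroring the Kafka producer configuration.
(def config
  {:bootstrap.servers ["127.0.0.1:9092"]
   :client.id         "example-producer"
   :acks              "all"})

;; 3-arity: config plus explicit key/value serializers.
(def p
  (producer/make-producer config
                          (serializers/string-serializer)
                          (serializers/edn-serializer)))

;; The returned FranzProducer is expected to be Closeable; close it when done.
(.close p)
```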

This producer implementation wraps the Kafka Java Producer API. It provides a Clojure-ish wrapper, converting Clojure data structures to and from Kafka, and implements various protocols so that more specialized producers can follow this implementation. If you prefer a lower-level implementation or wish to test your producer, you may want to browse this implementation and implement one or all of the protocols provided.

For per-function documentation, please see the source, which contains extensive comments, usage examples, etc.

Note: This implementation strikes a reasonable compromise between raw performance, extensibility, and usability, considering that it is:

  1. A wrapper
  2. Clojure

Producer options serve the following purposes:

  • Avoid repeated or inconvenient passing of defaults to methods that require options, such as timeouts. Many producers do not need per-call options.
  • Provide long-term extensibility as more features are added to this client, mitigating signature changes and excessive arities.
  • Offer cheaper lookups and a smaller memory footprint, since the options are created in final form as records.
  • Allow dynamic construction of producer options via stream processors, back-off logic, etc.
  • Reduce garbage collection for producers that do not need per-call options; overall, fewer intermediate maps and reified objects.
  • Avoid slow memory allocations in the aforementioned cases.
  • Mitigate Kafka Java API changes. The API has often been in flux, and extra options are sometimes necessary to handle quirks arising from Java API bugs.

Note: Producer options are distinct from the Kafka Producer Configuration.
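As a sketch of the options arity, an options map can be supplied at construction so that per-call defaults (timeouts, callbacks) need not be repeated on every send. The option keys, the protocol namespace, and send-sync! below are assumptions drawn from typical franzy usage rather than from this namespace's source; consult ps/ProducerOptions for the real schema.

```clojure
(require '[franzy.clients.producer.client :as producer]
         '[franzy.clients.producer.protocols :refer [send-sync!]] ;; assumed protocol namespace
         '[franzy.serialization.serializers :as serializers])

;; Hypothetical option keys; see ps/ProducerOptions for the authoritative schema.
(def options
  {:close-timeout 5000
   :send-callback (fn [metadata exception]
                    (when exception
                      (println "send failed:" (.getMessage exception))))})

;; with-open assumes the producer implements java.io.Closeable.
(with-open [p (producer/make-producer {:bootstrap.servers ["127.0.0.1:9092"]}
                                      (serializers/keyword-serializer)
                                      (serializers/edn-serializer)
                                      options)]
  ;; topic, partition, key, value, plus optional per-call options overriding the defaults
  (send-sync! p "example-topic" 0 :some-key {:hello "world"} options))
```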

