
fr33m0nk.alpakka-kafka.consumer

Akka Stream connector for subscribing to Kafka topics

->at-most-once-source

(->at-most-once-source consumer-settings topics)

Convenience for 'at-most-once delivery' semantics.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#atMostOnceSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.Subscription):akka.stream.javadsl.Source[org.apache.kafka.clients.consumer.ConsumerRecord[K,V],akka.kafka.javadsl.Consumer.Control]
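
A minimal sketch of constructing the source, assuming a settings value built with consumer-settings (below) and that topics are passed as a collection of topic-name strings; the topic name is a placeholder:

(require '[fr33m0nk.alpakka-kafka.consumer :as consumer])

;; Offsets are committed before each record is emitted downstream, so a
;; record may be lost on failure but is never processed twice.
(def at-most-once-source
  (consumer/->at-most-once-source settings ["my-topic"]))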

->commit-with-metadata-partitioned-source

(->commit-with-metadata-partitioned-source consumer-settings
                                           topics
                                           consumer-record-metadata-extractor)

The same as #plainPartitionedSource, but with support for committing offsets with metadata.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#commitWithMetadataPartitionedSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.AutoSubscription,metadataFromRecord:java.util.function.Function[org.apache.kafka.clients.consumer.ConsumerRecord[K,V],String]):akka.stream.javadsl.Source[akka.japi.Pair[org.apache.kafka.common.TopicPartition,akka.stream.javadsl.Source[akka.kafka.ConsumerMessage.CommittableMessage[K,V],akka.NotUsed]],akka.kafka.javadsl.Consumer.Control]

->commit-with-metadata-source

(->commit-with-metadata-source consumer-settings
                               topics
                               consumer-record-metadata-extractor)

The commitWithMetadataSource makes it possible to add additional metadata (in the form of a string) when an offset is committed based on the record.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#commitWithMetadataSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.Subscription,metadataFromRecord:java.util.function.Function[org.apache.kafka.clients.consumer.ConsumerRecord[K,V],String]):akka.stream.javadsl.Source[akka.kafka.ConsumerMessage.CommittableMessage[K,V],akka.kafka.javadsl.Consumer.Control]
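
A sketch of supplying the metadata extractor, assuming the wrapper accepts a plain Clojure function of the ConsumerRecord (mirroring the java.util.function.Function in the linked signature); settings and the topic name are placeholders:

;; Attach the record timestamp as string metadata on every offset commit.
(def metadata-source
  (consumer/->commit-with-metadata-source
   settings
   ["my-topic"]
   (fn [^org.apache.kafka.clients.consumer.ConsumerRecord record]
     (str "ts=" (.timestamp record)))))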

->committable-external-source

(->committable-external-source consumer
                               topic-partitions
                               group-id
                               commit-timeout-in-nanos)

The same as #plainExternalSource but with offset commit support.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#committableExternalSource[K,V](consumer:akka.actor.ActorRef,subscription:akka.kafka.ManualSubscription,groupId:String,commitTimeout:scala.concurrent.duration.FiniteDuration):akka.stream.javadsl.Source[akka.kafka.ConsumerMessage.CommittableMessage[K,V],akka.kafka.javadsl.Consumer.Control]

->committable-partitioned-source

(->committable-partitioned-source consumer-settings topics)

The same as #plainPartitionedSource but with offset commit support.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#committablePartitionedSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.AutoSubscription):akka.stream.javadsl.Source[akka.japi.Pair[org.apache.kafka.common.TopicPartition,akka.stream.javadsl.Source[akka.kafka.ConsumerMessage.CommittableMessage[K,V],akka.NotUsed]],akka.kafka.javadsl.Consumer.Control]

->committable-source

(->committable-source consumer-settings topics)

The committableSource makes it possible to commit offset positions to Kafka.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#committableSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.Subscription):akka.stream.javadsl.Source[akka.kafka.ConsumerMessage.CommittableMessage[K,V],akka.kafka.javadsl.Consumer.Control]
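
A sketch of an at-least-once pipeline, assuming an ActorSystem system, a settings value from consumer-settings, and plain interop with akka.kafka.CommitterSettings and akka.kafka.javadsl.Committer for the committing stage:

(import '(akka.japi.function Function)
        '(akka.kafka CommitterSettings)
        '(akka.kafka.javadsl Committer)
        '(akka.stream.javadsl Keep))

(def control-and-done
  (-> (consumer/->committable-source settings ["my-topic"])
      ;; Process each record, then pass its offset on for committing.
      (.map (reify Function
              (apply [_ msg]
                (println "value:" (consumer/value msg)) ; real work goes here
                (consumer/committable-offset msg))))
      (.toMat (Committer/sink (CommitterSettings/create system))
              (Keep/both))
      (.run system)))

The materialized akka.japi.Pair holds the Consumer.Control and the committer's completion stage; it can be handed to create-draining-control-with-pair (below) for a controlled shutdown.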

->plain-partitioned-source

(->plain-partitioned-source consumer-settings topics)

The plainPartitionedSource is a way to track automatic partition assignment from Kafka.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#plainPartitionedSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.AutoSubscription):akka.stream.javadsl.Source[akka.japi.Pair[org.apache.kafka.common.TopicPartition,akka.stream.javadsl.Source[org.apache.kafka.clients.consumer.ConsumerRecord[K,V],akka.NotUsed]],akka.kafka.javadsl.Consumer.Control]
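
A sketch of consuming the per-partition sub-sources, assuming settings and an ActorSystem system; each emitted element is an akka.japi.Pair of the assigned TopicPartition and a Source of that partition's records:

(import '(akka.japi.function Procedure)
        '(akka.stream.javadsl Sink))

(def control
  (-> (consumer/->plain-partitioned-source settings ["my-topic"])
      (.to (Sink/foreach
            (reify Procedure
              (apply [_ pair]
                (let [topic-partition (.first pair)
                      partition-source (.second pair)]
                  ;; Run each per-partition sub-stream independently.
                  (.runWith partition-source
                            (Sink/foreach
                             (reify Procedure
                               (apply [_ record]
                                 (println (str topic-partition) "->" (.value record)))))
                            system))))))
      (.run system)))

Real code would more likely merge the sub-streams (e.g. with flatMapMerge) or process them concurrently; this only shows the shape of the emitted pairs.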

->plain-source

(->plain-source consumer-settings topics)

The plainSource emits ConsumerRecord elements (as received from the underlying KafkaConsumer). It has no support for committing offsets to Kafka. It can be used when the offset is stored externally or with auto-commit (note that auto-commit is disabled by default).
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#plainSource[K,V](settings:akka.kafka.ConsumerSettings[K,V],subscription:akka.kafka.Subscription):akka.stream.javadsl.Source[org.apache.kafka.clients.consumer.ConsumerRecord[K,V],akka.kafka.javadsl.Consumer.Control]
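
A minimal sketch, assuming offsets are tracked outside Kafka (auto-commit is disabled by default, as noted above); settings, system and the topic name are placeholders:

(import '(akka.japi.function Procedure)
        '(akka.stream.javadsl Sink))

(def control
  (-> (consumer/->plain-source settings ["my-topic"])
      (.to (Sink/foreach
            (reify Procedure
              (apply [_ record]
                ;; Persist (.offset record) to the external offset store here.
                (println (.key record) "->" (.value record))))))
      (.run system)))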

consumer-settings

(consumer-settings actor-system
                   {:keys [group-id key-deserializer value-deserializer
                           bootstrap-servers auto-offset-reset
                           enable-auto-commit]
                    :or {auto-offset-reset "latest" enable-auto-commit "false"}
                    :as consumer-properties})

Settings for consumers. See the akka.kafka.consumer section in reference.conf.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/ConsumerSettings.html
- Expects consumer-properties to be supplied with kebab-case-keyword keys.
  The full config list can be found in org.apache.kafka.clients.consumer.ConsumerConfig.
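
A sketch of building settings, assuming the map takes deserializer instances under :key-deserializer / :value-deserializer and kebab-case keywords for any other org.apache.kafka.clients.consumer.ConsumerConfig option; the group id and servers are placeholders:

(import '(org.apache.kafka.common.serialization StringDeserializer))

(def settings
  (consumer/consumer-settings
   system ; an akka.actor.ActorSystem
   {:group-id "my-consumer-group"
    :bootstrap-servers "localhost:9092"
    :key-deserializer (StringDeserializer.)
    :value-deserializer (StringDeserializer.)
    :auto-offset-reset "earliest" ; overrides the "latest" default above
    :enable-auto-commit "false"}))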

create-draining-control

Combine control and a stream completion signal materialized values into one, so that the stream can be stopped in a controlled way without losing commits.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#createDrainingControl[T](c:akka.kafka.javadsl.Consumer.Control,mat:java.util.concurrent.CompletionStage[T]):akka.kafka.javadsl.Consumer.DrainingControl[T]
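
A sketch, assuming (per the linked signature) the function takes the materialized Consumer.Control and the stream's CompletionStage as separate arguments; Sink/ignore stands in for real processing stages:

(import '(akka.stream.javadsl Keep Sink))

(def draining-control
  (let [pair (-> (consumer/->committable-source settings ["my-topic"])
                 ;; real processing stages would go here
                 (.toMat (Sink/ignore) (Keep/both))
                 (.run system))]
    (consumer/create-draining-control (.first pair) (.second pair))))

create-draining-control-with-pair (below) accepts the akka.japi.Pair directly instead.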

create-draining-control-with-pair

Combine control and a stream completion signal materialized values into one, so that the stream can be stopped in a controlled way without losing commits. This variant accepts an akka.japi.Pair of the Control and the stream's CompletionStage.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#createDrainingControl[T](pair:akka.japi.Pair[akka.kafka.javadsl.Consumer.Control,java.util.concurrent.CompletionStage[T]]):akka.kafka.javadsl.Consumer.DrainingControl[T]

create-noop-control

Create a no-op Control, e.g. as a placeholder where no real consumer Control is available.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/javadsl/Consumer$.html#createNoopControl():akka.kafka.javadsl.Consumer.Control

IConsumerMessage (protocol)

committable-offset

(committable-offset committable-message)

ConsumerMessage is the output element of committableSource.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/ConsumerMessage$$CommittableMessage.html#committableOffset:akka.kafka.ConsumerMessage.CommittableOffset

consumer-record

(consumer-record committable-message)

Extracts consumer-record from an instance of CommittableMessage

partition-offset

(partition-offset transactional-message)

Output element of transactionalSource. The offset is automatically committed by the Producer.
https://doc.akka.io/api/alpakka-kafka/4.0.2/akka/kafka/ConsumerMessage$$TransactionalMessage.html#partitionOffset:akka.kafka.ConsumerMessage.PartitionOffset
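
A sketch of taking a CommittableMessage apart, assuming msg is an element emitted by ->committable-source:

(let [record (consumer/consumer-record msg)     ; the underlying ConsumerRecord
      offset (consumer/committable-offset msg)] ; hand this to a committer stage
  (println (.topic record) (.partition record) (.offset record))
  offset)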

IConsumerRecord (protocol)

key

(key committable-message-or-consumer-record)

value

(value committable-message-or-consumer-record)
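
A sketch: per the arglists, key and value accept either a CommittableMessage or a raw ConsumerRecord. Note that key shadows clojure.core/key, so refer to these through a namespace alias:

(require '[fr33m0nk.alpakka-kafka.consumer :as consumer])

(defn record->map
  [record-or-message]
  {:key   (consumer/key record-or-message)
   :value (consumer/value record-or-message)})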

IControl (protocol)

Materialized value of the consumer Source.

drain-and-shutdown

(drain-and-shutdown consumer-control stream-completion executor)

Stop producing messages from the Source, wait for stream completion and shut down the consumer Source so that all consumed messages reach the end of the stream. Failures in stream completion will be propagated; the source will be shut down anyway.

get-metrics

(get-metrics consumer-control)

Exposes underlying consumer or producer metrics (as reported by the underlying Kafka client library).

shutdown

(shutdown consumer-control)

Shut down the consumer Source. It will wait for outstanding offset commit requests before shutting down.

shutdown?

(shutdown? consumer-control)

Shutdown status. The CompletionStage will be completed when the stage has been shut down and the underlying KafkaConsumer has been closed. Shutdown can be triggered from downstream cancellation, errors, or #shutdown.

stop

(stop consumer-control)

Stop producing messages from the Source. This does not stop the underlying Kafka consumer and does not unsubscribe from any topics/partitions.
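
A sketch of a controlled shutdown, assuming settings and an ActorSystem system; Keep/both retains the Control alongside the sink's completion stage so both can be handed to drain-and-shutdown:

(import '(akka.japi.function Procedure)
        '(akka.stream.javadsl Keep Sink)
        '(java.util.concurrent Executors))

(let [pair     (-> (consumer/->plain-source settings ["my-topic"])
                   (.toMat (Sink/foreach
                            (reify Procedure
                              (apply [_ record] (println (.value record)))))
                           (Keep/both))
                   (.run system))
      control  (.first pair)
      done     (.second pair)
      executor (Executors/newSingleThreadExecutor)]
  ;; Stop emitting, let in-flight records reach the end of the stream,
  ;; then close the underlying KafkaConsumer.
  (consumer/drain-and-shutdown control done executor))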

topics->subscriptions

(topics->subscriptions topics)
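
No docstring is attached here; judging by the name and the subscription parameters of the javadsl calls above, it likely turns a collection of topic names into an akka.kafka.Subscription (an assumption, not confirmed by the source):

;; Hypothetical usage, assuming topic names in, Subscription out.
(consumer/topics->subscriptions ["events" "audit-log"])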
