
sparkplug.context

Functions for working with and creating Spark contexts.

add-file! (clj)

(add-file! spark-context path)
(add-file! spark-context path recursive?)

Add a file to be downloaded with this Spark job on every node.
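As a quick sketch of how this might be used — assuming an existing SparkContext bound to `sc` and a local file at the hypothetical path shown:

```clojure
;; Distribute a lookup file so every executor node can read it locally.
;; The file is downloaded once per node when the job runs.
(add-file! sc "data/lookup.csv")
```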

add-jar! (clj)

(add-jar! spark-context path)

Adds a JAR dependency for all tasks to be executed on this SparkContext in the future.

cancel-all-jobs! (clj)

(cancel-all-jobs! spark-context)

Cancel all jobs that have been scheduled or are running.

cancel-job-group! (clj)

(cancel-job-group! spark-context group-id)

Cancel active jobs for the specified group.

See `set-job-group!` for more information.

clear-job-group! (clj)

(clear-job-group! spark-context)

Clear the current thread's job group ID and its description.

config (clj)

(config spark-context)

Return the Spark configuration used for the given context.

get-local-property (clj)

(get-local-property spark-context k)

Get a local property set for this thread, or null if not set.

info (clj)

(info spark-context)

Build a map of information about the Spark context.

persistent-rdds (clj)

(persistent-rdds spark-context)

Return a Java map of JavaRDDs that have marked themselves as persistent via a `cache!` call.

set-checkpoint-dir! (clj)

(set-checkpoint-dir! spark-context path)

Set the directory under which RDDs are going to be checkpointed.

set-job-description! (clj)

(set-job-description! spark-context description)

Set a human-readable description of the current job.

set-job-group! (clj)

(set-job-group! spark-context group-id description)
(set-job-group! spark-context group-id description interrupt?)

Assign a group ID to all the jobs started by this thread until the group ID is set to a different value or cleared.

Often, a unit of execution in an application consists of multiple Spark actions or jobs. Application programmers can use this method to group all those jobs together and give a group description. Once set, the Spark web UI will associate such jobs with this group.

The application can later use `cancel-job-group!` to cancel all running jobs in this group. If `interrupt?` is set to true for the job group, then job cancellation will result in the job's executor threads being interrupted.
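A sketch of grouping and cancelling jobs together, assuming an existing SparkContext bound to `sc` and a hypothetical group ID:

```clojure
;; Tag all jobs started by this thread with a group ID. Passing true
;; for interrupt? means cancellation interrupts executor threads.
(set-job-group! sc "nightly-etl" "Nightly ETL pipeline" true)

;; ... submit Spark actions from this thread ...

;; Later, possibly from another thread, cancel everything in the group:
(cancel-job-group! sc "nightly-etl")
```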

set-local-property! (clj)

(set-local-property! spark-context k v)

Set a local property that affects jobs submitted from this thread, and all child threads, such as the Spark fair scheduler pool.
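For example, the fair-scheduler pool mentioned above is selected via the standard Spark property `spark.scheduler.pool` — a sketch, assuming an existing SparkContext bound to `sc` and fair scheduling enabled in the Spark configuration:

```clojure
;; Route jobs submitted from this thread (and its children) into a
;; named fair-scheduler pool instead of the default pool.
(set-local-property! sc "spark.scheduler.pool" "low-priority")
```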

set-log-level! (clj)

(set-log-level! spark-context level)

Control the Spark application's logging level.

spark-context (clj)

(spark-context conf)
(spark-context master app-name)

Create a new Spark context which takes its settings from the given configuration object, or from a master URL and application name.
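A minimal sketch of the two-argument arity, using a local master URL and a hypothetical application name:

```clojure
;; Construct a context against a local two-thread master. Remember to
;; call stop! (or use with-context) when the application is done.
(def sc (spark-context "local[2]" "example-app"))
```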

stop! (clj)

(stop! spark-context)

Shut down the Spark context.

with-context (clj macro)

(with-context binding-vec & body)

Evaluate `body` within a new Spark context by constructing one from the given expression. The context is stopped after evaluation is complete.
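A sketch of the macro in use — the `[name expr]` shape of `binding-vec` is an assumption based on the signature above:

```clojure
;; Run a body against a short-lived local context; the context is
;; stopped automatically once the body finishes, even on error.
(with-context [sc (spark-context "local[2]" "example-job")]
  (set-log-level! sc "WARN")
  ;; ... Spark actions using sc ...
  (info sc))
```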
