happygapi.dataflow.projects

Dataflow API: projects.
Manages Google Cloud Dataflow projects on Google Cloud Platform.
See: https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects

jobs-aggregated$

(jobs-aggregated$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/aggregated

Required parameters: projectId

Optional parameters: filter, location, pageToken, pageSize, view
List the jobs of a project across all regions.
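
For example, a minimal REPL sketch of listing jobs across all regions. `auth` is assumed to be whatever OAuth 2.0 credential map you already use with happygapi, the project ID and parameter values are placeholders, and optional query parameters are assumed to go in the same args map as the required ones (as the parameter lists above suggest):

(require '[happygapi.dataflow.projects :as projects])

;; `auth` is an OAuth 2.0 credential/token map obtained elsewhere (assumption).
(projects/jobs-aggregated$
  auth
  {:projectId "my-project"   ; required
   :filter "ACTIVE"          ; optional: only active jobs
   :pageSize 50})            ; optional paging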

jobs-create$

(jobs-create$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/create

Required parameters: projectId

Optional parameters: location, replaceJobId, view

Body: 

{:labels {},
 :stepsLocation string,
 :executionInfo {:stages {}},
 :jobMetadata {:datastoreDetails [DatastoreIODetails],
               :sdkVersion SdkVersion,
               :fileDetails [FileIODetails],
               :bigqueryDetails [BigQueryIODetails],
               :pubsubDetails [PubSubIODetails],
               :bigTableDetails [BigTableIODetails],
               :spannerDetails [SpannerIODetails]},
 :clientRequestId string,
 :startTime string,
 :stageStates [{:executionStageName string,
                :currentStateTime string,
                :executionStageState string}],
 :name string,
 :steps [{:properties {}, :name string, :kind string}],
 :createTime string,
 :currentStateTime string,
 :type string,
 :transformNameMapping {},
 :replaceJobId string,
 :pipelineDescription {:originalPipelineTransform [TransformSummary],
                       :displayData [DisplayData],
                       :executionPipelineStage [ExecutionStageSummary]},
 :replacedByJobId string,
 :currentState string,
 :tempFiles [string],
 :id string,
 :createdFromSnapshotId string,
 :environment {:flexResourceSchedulingGoal string,
               :internalExperiments {},
               :experiments [string],
               :workerRegion string,
               :sdkPipelineOptions {},
               :serviceKmsKeyName string,
               :tempStoragePrefix string,
               :serviceAccountEmail string,
               :clusterManagerApiService string,
               :userAgent {},
               :workerZone string,
               :version {},
               :workerPools [WorkerPool],
               :dataset string},
 :projectId string,
 :requestedState string,
 :location string}

Creates a Cloud Dataflow job.

To create a job, we recommend using `projects.locations.jobs.create` with a
[regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.create` is not recommended, as your job will always start
in `us-central1`.
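
As a sketch only (the field values are illustrative and `auth` is assumed to be a happygapi OAuth 2.0 credential map), creating a small batch job could look like this, using a subset of the Job body shown above:

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-create$
  auth
  {:projectId "my-project"                 ; required parameter
   :location "us-central1"}                ; optional; jobs.create always runs in us-central1
  {:name "example-batch-job"               ; Job body: only the fields you need
   :type "JOB_TYPE_BATCH"
   :environment {:tempStoragePrefix "gs://my-bucket/tmp"}})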

jobs-debug-getConfig$

(jobs-debug-getConfig$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/getConfig

Required parameters: projectId, jobId

Optional parameters: none

Body: 

{:location string, :componentId string, :workerId string}

Get encoded debug configuration for component. Not cacheable.

jobs-debug-sendCapture$

(jobs-debug-sendCapture$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/sendCapture

Required parameters: projectId, jobId

Optional parameters: none

Body: 

{:componentId string,
 :workerId string,
 :location string,
 :data string}

Send encoded debug capture data for component.

jobs-get$

(jobs-get$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/get

Required parameters: projectId, jobId

Optional parameters: location, view
Gets the state of the specified Cloud Dataflow job.

To get the state of a job, we recommend using `projects.locations.jobs.get`
with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.get` is not recommended, as you can only get the state of
jobs that are running in `us-central1`.
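
A minimal sketch with placeholder IDs, assuming the usual happygapi `auth` map:

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-get$
  auth
  {:projectId "my-project"
   :jobId "my-job-id"           ; placeholder
   :view "JOB_VIEW_SUMMARY"})   ; optional: summary instead of the full Job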

jobs-getMetrics$

(jobs-getMetrics$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/getMetrics

Required parameters: projectId, jobId

Optional parameters: startTime, location
Request the job status.

To request the status of a job, we recommend using
`projects.locations.jobs.getMetrics` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.getMetrics` is not recommended, as you can only request the
status of jobs that are running in `us-central1`.
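
A minimal sketch, assuming placeholder IDs and the usual happygapi `auth` map; `startTime` restricts the response to metrics reported after that instant:

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-getMetrics$
  auth
  {:projectId "my-project"
   :jobId "my-job-id"
   :startTime "2020-01-01T00:00:00Z"})   ; optional RFC 3339 timestamp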

jobs-list$

(jobs-list$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/list

Required parameters: projectId

Optional parameters: pageToken, pageSize, view, filter, location
List the jobs of a project.

To list the jobs of a project in a region, we recommend using
`projects.locations.jobs.list` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To
list all jobs across all regions, use `projects.jobs.aggregated`. Using
`projects.jobs.list` is not recommended, as you can only get the list of
jobs that are running in `us-central1`.
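
For illustration (placeholder values, usual `auth` assumption):

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-list$
  auth
  {:projectId "my-project"
   :filter "ACTIVE"     ; optional
   :pageSize 25})       ; optional; pass the response's :pageToken to fetch the next page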

jobs-messages-list$

(jobs-messages-list$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/messages/list

Required parameters: jobId, projectId

Optional parameters: endTime, location, startTime, pageToken, pageSize, minimumImportance
Request the job status.

To request the status of a job, we recommend using
`projects.locations.jobs.messages.list` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.messages.list` is not recommended, as you can only request
the status of jobs that are running in `us-central1`.
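
A sketch of pulling only warnings and errors for a job (placeholder IDs, usual `auth` assumption):

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-messages-list$
  auth
  {:projectId "my-project"
   :jobId "my-job-id"
   :minimumImportance "JOB_MESSAGE_WARNING"   ; optional severity floor
   :pageSize 100})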

jobs-update$

(jobs-update$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/update

Required parameters: projectId, jobId

Optional parameters: location

Body: 

{:labels {},
 :stepsLocation string,
 :executionInfo {:stages {}},
 :jobMetadata {:datastoreDetails [DatastoreIODetails],
               :sdkVersion SdkVersion,
               :fileDetails [FileIODetails],
               :bigqueryDetails [BigQueryIODetails],
               :pubsubDetails [PubSubIODetails],
               :bigTableDetails [BigTableIODetails],
               :spannerDetails [SpannerIODetails]},
 :clientRequestId string,
 :startTime string,
 :stageStates [{:executionStageName string,
                :currentStateTime string,
                :executionStageState string}],
 :name string,
 :steps [{:properties {}, :name string, :kind string}],
 :createTime string,
 :currentStateTime string,
 :type string,
 :transformNameMapping {},
 :replaceJobId string,
 :pipelineDescription {:originalPipelineTransform [TransformSummary],
                       :displayData [DisplayData],
                       :executionPipelineStage [ExecutionStageSummary]},
 :replacedByJobId string,
 :currentState string,
 :tempFiles [string],
 :id string,
 :createdFromSnapshotId string,
 :environment {:flexResourceSchedulingGoal string,
               :internalExperiments {},
               :experiments [string],
               :workerRegion string,
               :sdkPipelineOptions {},
               :serviceKmsKeyName string,
               :tempStoragePrefix string,
               :serviceAccountEmail string,
               :clusterManagerApiService string,
               :userAgent {},
               :workerZone string,
               :version {},
               :workerPools [WorkerPool],
               :dataset string},
 :projectId string,
 :requestedState string,
 :location string}

Updates the state of an existing Cloud Dataflow job.

To update the state of an existing job, we recommend using
`projects.locations.jobs.update` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.update` is not recommended, as you can only update the state
of jobs that are running in `us-central1`.
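
A common use is requesting cancellation. A sketch, assuming the service accepts a body containing just the field being changed (placeholder IDs, usual `auth` assumption):

(require '[happygapi.dataflow.projects :as projects])

(projects/jobs-update$
  auth
  {:projectId "my-project"
   :jobId "my-job-id"}
  {:requestedState "JOB_STATE_CANCELLED"})   ; ask the service to cancel the job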

jobs-workItems-lease$

(jobs-workItems-lease$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/lease

Required parameters: projectId, jobId

Optional parameters: none

Body: 

{:requestedLeaseDuration string,
 :currentWorkerTime string,
 :location string,
 :workItemTypes [string],
 :unifiedWorkerRequest {},
 :workerId string,
 :workerCapabilities [string]}

Leases a dataflow WorkItem to run.

jobs-workItems-reportStatus$

(jobs-workItems-reportStatus$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/reportStatus

Required parameters: projectId, jobId

Optional parameters: none

Body: 

{:workItemStatuses [{:stopPosition Position,
                     :sourceFork SourceFork,
                     :sourceOperationResponse SourceOperationResponse,
                     :errors [Status],
                     :reportedProgress ApproximateReportedProgress,
                     :completed boolean,
                     :workItemId string,
                     :reportIndex string,
                     :totalThrottlerWaitTimeSeconds number,
                     :metricUpdates [MetricUpdate],
                     :progress ApproximateProgress,
                     :dynamicSourceSplit DynamicSourceSplit,
                     :counterUpdates [CounterUpdate],
                     :requestedLeaseDuration string}],
 :unifiedWorkerRequest {},
 :workerId string,
 :currentWorkerTime string,
 :location string}

Reports the status of dataflow WorkItems leased by a worker.

locations-flexTemplates-launch$

(locations-flexTemplates-launch$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/flexTemplates/launch

Required parameters: projectId, location

Optional parameters: none

Body: 

{:validateOnly boolean,
 :launchParameter {:containerSpec ContainerSpec,
                   :containerSpecGcsPath string,
                   :parameters {},
                   :jobName string}}

Launch a job with a FlexTemplate.
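
A sketch of launching a Flex Template; the GCS path, job name, and parameter names are placeholders that depend on your template, and the usual `auth` assumption applies:

(require '[happygapi.dataflow.projects :as projects])

(projects/locations-flexTemplates-launch$
  auth
  {:projectId "my-project"
   :location "europe-west1"}
  {:launchParameter
   {:containerSpecGcsPath "gs://my-bucket/templates/my-flex-template.json"
    :jobName "flex-template-run"
    :parameters {"inputSubscription" "projects/my-project/subscriptions/my-sub"
                 "outputTable" "my-project:my_dataset.my_table"}}})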

locations-jobs-create$

(locations-jobs-create$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/create

Required parameters: projectId, location

Optional parameters: view, replaceJobId

Body: 

{:labels {},
 :stepsLocation string,
 :executionInfo {:stages {}},
 :jobMetadata {:datastoreDetails [DatastoreIODetails],
               :sdkVersion SdkVersion,
               :fileDetails [FileIODetails],
               :bigqueryDetails [BigQueryIODetails],
               :pubsubDetails [PubSubIODetails],
               :bigTableDetails [BigTableIODetails],
               :spannerDetails [SpannerIODetails]},
 :clientRequestId string,
 :startTime string,
 :stageStates [{:executionStageName string,
                :currentStateTime string,
                :executionStageState string}],
 :name string,
 :steps [{:properties {}, :name string, :kind string}],
 :createTime string,
 :currentStateTime string,
 :type string,
 :transformNameMapping {},
 :replaceJobId string,
 :pipelineDescription {:originalPipelineTransform [TransformSummary],
                       :displayData [DisplayData],
                       :executionPipelineStage [ExecutionStageSummary]},
 :replacedByJobId string,
 :currentState string,
 :tempFiles [string],
 :id string,
 :createdFromSnapshotId string,
 :environment {:flexResourceSchedulingGoal string,
               :internalExperiments {},
               :experiments [string],
               :workerRegion string,
               :sdkPipelineOptions {},
               :serviceKmsKeyName string,
               :tempStoragePrefix string,
               :serviceAccountEmail string,
               :clusterManagerApiService string,
               :userAgent {},
               :workerZone string,
               :version {},
               :workerPools [WorkerPool],
               :dataset string},
 :projectId string,
 :requestedState string,
 :location string}

Creates a Cloud Dataflow job.

To create a job, we recommend using `projects.locations.jobs.create` with a
[regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.create` is not recommended, as your job will always start
in `us-central1`.
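
This is the regionalized form of jobs-create$ recommended above; a sketch with illustrative values (usual `auth` assumption):

(require '[happygapi.dataflow.projects :as projects])

(projects/locations-jobs-create$
  auth
  {:projectId "my-project"
   :location "europe-west1"}                ; regional endpoint
  {:name "example-batch-job"
   :type "JOB_TYPE_BATCH"
   :environment {:tempStoragePrefix "gs://my-bucket/tmp"}})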

locations-jobs-debug-getConfig$

(locations-jobs-debug-getConfig$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/getConfig

Required parameters: projectId, jobId, location

Optional parameters: none

Body: 

{:location string, :componentId string, :workerId string}

Get encoded debug configuration for component. Not cacheable.

locations-jobs-debug-sendCapture$

(locations-jobs-debug-sendCapture$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/sendCapture

Required parameters: location, projectId, jobId

Optional parameters: none

Body: 

{:componentId string,
 :workerId string,
 :location string,
 :data string}

Send encoded debug capture data for component.

locations-jobs-get$

(locations-jobs-get$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/get

Required parameters: location, projectId, jobId

Optional parameters: view
Gets the state of the specified Cloud Dataflow job.

To get the state of a job, we recommend using `projects.locations.jobs.get`
with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.get` is not recommended, as you can only get the state of
jobs that are running in `us-central1`.

locations-jobs-getMetrics$

(locations-jobs-getMetrics$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/getMetrics

Required parameters: location, projectId, jobId

Optional parameters: startTime
Request the job status.

To request the status of a job, we recommend using
`projects.locations.jobs.getMetrics` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.getMetrics` is not recommended, as you can only request the
status of jobs that are running in `us-central1`.

locations-jobs-list$

(locations-jobs-list$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/list

Required parameters: projectId, location

Optional parameters: filter, pageToken, pageSize, view
List the jobs of a project.

To list the jobs of a project in a region, we recommend using
`projects.locations.jobs.list` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To
list all jobs across all regions, use `projects.jobs.aggregated`. Using
`projects.jobs.list` is not recommended, as you can only get the list of
jobs that are running in `us-central1`.

locations-jobs-messages-list$

(locations-jobs-messages-list$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/messages/list

Required parameters: projectId, jobId, location

Optional parameters: minimumImportance, endTime, pageToken, startTime, pageSize
Request the job status.

To request the status of a job, we recommend using
`projects.locations.jobs.messages.list` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.messages.list` is not recommended, as you can only request
the status of jobs that are running in `us-central1`.

locations-jobs-update$

(locations-jobs-update$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/update

Required parameters: projectId, jobId, location

Optional parameters: none

Body: 

{:labels {},
 :stepsLocation string,
 :executionInfo {:stages {}},
 :jobMetadata {:datastoreDetails [DatastoreIODetails],
               :sdkVersion SdkVersion,
               :fileDetails [FileIODetails],
               :bigqueryDetails [BigQueryIODetails],
               :pubsubDetails [PubSubIODetails],
               :bigTableDetails [BigTableIODetails],
               :spannerDetails [SpannerIODetails]},
 :clientRequestId string,
 :startTime string,
 :stageStates [{:executionStageName string,
                :currentStateTime string,
                :executionStageState string}],
 :name string,
 :steps [{:properties {}, :name string, :kind string}],
 :createTime string,
 :currentStateTime string,
 :type string,
 :transformNameMapping {},
 :replaceJobId string,
 :pipelineDescription {:originalPipelineTransform [TransformSummary],
                       :displayData [DisplayData],
                       :executionPipelineStage [ExecutionStageSummary]},
 :replacedByJobId string,
 :currentState string,
 :tempFiles [string],
 :id string,
 :createdFromSnapshotId string,
 :environment {:flexResourceSchedulingGoal string,
               :internalExperiments {},
               :experiments [string],
               :workerRegion string,
               :sdkPipelineOptions {},
               :serviceKmsKeyName string,
               :tempStoragePrefix string,
               :serviceAccountEmail string,
               :clusterManagerApiService string,
               :userAgent {},
               :workerZone string,
               :version {},
               :workerPools [WorkerPool],
               :dataset string},
 :projectId string,
 :requestedState string,
 :location string}

Updates the state of an existing Cloud Dataflow job.

To update the state of an existing job, we recommend using
`projects.locations.jobs.update` with a [regional endpoint]
(https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using
`projects.jobs.update` is not recommended, as you can only update the state
of jobs that are running in `us-central1`.

locations-jobs-workItems-lease$

(locations-jobs-workItems-lease$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/lease

Required parameters: location, projectId, jobId

Optional parameters: none

Body: 

{:requestedLeaseDuration string,
 :currentWorkerTime string,
 :location string,
 :workItemTypes [string],
 :unifiedWorkerRequest {},
 :workerId string,
 :workerCapabilities [string]}

Leases a dataflow WorkItem to run.

locations-jobs-workItems-reportStatus$

(locations-jobs-workItems-reportStatus$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/reportStatus

Required parameters: location, projectId, jobId

Optional parameters: none

Body: 

{:workItemStatuses [{:stopPosition Position,
                     :sourceFork SourceFork,
                     :sourceOperationResponse SourceOperationResponse,
                     :errors [Status],
                     :reportedProgress ApproximateReportedProgress,
                     :completed boolean,
                     :workItemId string,
                     :reportIndex string,
                     :totalThrottlerWaitTimeSeconds number,
                     :metricUpdates [MetricUpdate],
                     :progress ApproximateProgress,
                     :dynamicSourceSplit DynamicSourceSplit,
                     :counterUpdates [CounterUpdate],
                     :requestedLeaseDuration string}],
 :unifiedWorkerRequest {},
 :workerId string,
 :currentWorkerTime string,
 :location string}

Reports the status of dataflow WorkItems leased by a worker.

locations-sql-validate$

(locations-sql-validate$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/sql/validate

Required parameters: projectId, location

Optional parameters: query
Validates a GoogleSQL query for Cloud Dataflow syntax. Will always
confirm the given query parses correctly, and if able to look up
schema information from DataCatalog, will validate that the query
analyzes properly as well.
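
A sketch; the query string is illustrative only (a Dataflow SQL style statement) and the usual `auth` assumption applies:

(require '[happygapi.dataflow.projects :as projects])

(projects/locations-sql-validate$
  auth
  {:projectId "my-project"
   :location "us-central1"
   :query "SELECT * FROM pubsub.topic.`my-project`.`my-topic`"})   ; hypothetical query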

locations-templates-create$

(locations-templates-create$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/create

Required parameters: location, projectId

Optional parameters: none

Body: 

{:jobName string,
 :gcsPath string,
 :environment {:maxWorkers integer,
               :workerRegion string,
               :additionalExperiments [string],
               :zone string,
               :machineType string,
               :tempLocation string,
               :numWorkers integer,
               :serviceAccountEmail string,
               :bypassTempDirValidation boolean,
               :ipConfiguration string,
               :kmsKeyName string,
               :network string,
               :workerZone string,
               :additionalUserLabels {},
               :subnetwork string},
 :location string,
 :parameters {}}

Creates a Cloud Dataflow job from a template.
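
A sketch of running a classic template. The template path and its parameter names are assumptions (check the template's own documentation), and `auth` is the usual happygapi credential map:

(require '[happygapi.dataflow.projects :as projects])

(projects/locations-templates-create$
  auth
  {:projectId "my-project"
   :location "us-central1"}
  {:jobName "wordcount-from-template"
   :gcsPath "gs://dataflow-templates/latest/Word_Count"   ; assumed template path
   :parameters {"inputFile" "gs://my-bucket/input.txt"
                "output" "gs://my-bucket/output/results"}
   :environment {:tempLocation "gs://my-bucket/tmp"}})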

locations-templates-get$

(locations-templates-get$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/get

Required parameters: projectId, location

Optional parameters: view, gcsPath
Get the template associated with a template.

locations-templates-launch$

(locations-templates-launch$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/launch

Required parameters: projectId, location

Optional parameters: gcsPath, dynamicTemplate.gcsPath, dynamicTemplate.stagingLocation, validateOnly

Body: 

{:parameters {},
 :jobName string,
 :transformNameMapping {},
 :environment {:maxWorkers integer,
               :workerRegion string,
               :additionalExperiments [string],
               :zone string,
               :machineType string,
               :tempLocation string,
               :numWorkers integer,
               :serviceAccountEmail string,
               :bypassTempDirValidation boolean,
               :ipConfiguration string,
               :kmsKeyName string,
               :network string,
               :workerZone string,
               :additionalUserLabels {},
               :subnetwork string},
 :update boolean}

Launch a template.
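
Launch differs from templates-create mainly in taking gcsPath as a query parameter and supporting validateOnly; a sketch under the same assumptions as the create example above:

(require '[happygapi.dataflow.projects :as projects])

(projects/locations-templates-launch$
  auth
  {:projectId "my-project"
   :location "us-central1"
   :gcsPath "gs://dataflow-templates/latest/Word_Count"   ; assumed template path
   :validateOnly true}                                    ; dry-run the request
  {:jobName "wordcount-launch"
   :parameters {"inputFile" "gs://my-bucket/input.txt"
                "output" "gs://my-bucket/output/results"}
   :environment {:tempLocation "gs://my-bucket/tmp"}})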

locations-workerMessages$

(locations-workerMessages$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/workerMessages

Required parameters: location, projectId

Optional parameters: none

Body: 

{:workerMessages [{:workerShutdownNotice WorkerShutdownNotice,
                   :workerHealthReport WorkerHealthReport,
                   :workerMessageCode WorkerMessageCode,
                   :workerMetrics ResourceUtilizationReport,
                   :labels {},
                   :time string,
                   :workerLifecycleEvent WorkerLifecycleEvent}],
 :location string}

Send a worker_message to the service.

templates-create$

(templates-create$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/create

Required parameters: projectId

Optional parameters: none

Body: 

{:jobName string,
 :gcsPath string,
 :environment {:maxWorkers integer,
               :workerRegion string,
               :additionalExperiments [string],
               :zone string,
               :machineType string,
               :tempLocation string,
               :numWorkers integer,
               :serviceAccountEmail string,
               :bypassTempDirValidation boolean,
               :ipConfiguration string,
               :kmsKeyName string,
               :network string,
               :workerZone string,
               :additionalUserLabels {},
               :subnetwork string},
 :location string,
 :parameters {}}

Creates a Cloud Dataflow job from a template.

templates-get$

(templates-get$ auth args)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/get

Required parameters: projectId

Optional parameters: location, view, gcsPath
Get the template associated with a template.

templates-launch$

(templates-launch$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/launch

Required parameters: projectId

Optional parameters: validateOnly, gcsPath, location, dynamicTemplate.gcsPath, dynamicTemplate.stagingLocation

Body: 

{:parameters {},
 :jobName string,
 :transformNameMapping {},
 :environment {:maxWorkers integer,
               :workerRegion string,
               :additionalExperiments [string],
               :zone string,
               :machineType string,
               :tempLocation string,
               :numWorkers integer,
               :serviceAccountEmail string,
               :bypassTempDirValidation boolean,
               :ipConfiguration string,
               :kmsKeyName string,
               :network string,
               :workerZone string,
               :additionalUserLabels {},
               :subnetwork string},
 :update boolean}

Launch a template.

workerMessages$

(workerMessages$ auth args body)

https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/workerMessages

Required parameters: projectId

Optional parameters: none

Body: 

{:workerMessages [{:workerShutdownNotice WorkerShutdownNotice,
                   :workerHealthReport WorkerHealthReport,
                   :workerMessageCode WorkerMessageCode,
                   :workerMetrics ResourceUtilizationReport,
                   :labels {},
                   :time string,
                   :workerLifecycleEvent WorkerLifecycleEvent}],
 :location string}

Send a worker_message to the service.