Dataflow API: projects. Manages Google Cloud Dataflow projects on Google Cloud Platform. See: https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects
(deleteSnapshots$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/deleteSnapshots
Required parameters: projectId
Optional parameters: location, snapshotId
Deletes a snapshot.
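A minimal call sketch. The namespace require, `auth` credential, project ID, and snapshot ID are illustrative assumptions, not part of this reference:

```clojure
(require '[happygapi.dataflow.projects :as projects]) ; assumed namespace

;; Delete a snapshot, scoping the request to its region.
(projects/deleteSnapshots$
  auth                              ; valid OAuth2 credentials (assumed)
  {:projectId "my-project"          ; required
   :location "us-east1"             ; optional
   :snapshotId "snapshot-123"})     ; optional
```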
(jobs-aggregated$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/aggregated
Required parameters: projectId
Optional parameters: filter, location, pageToken, pageSize, view
List the jobs of a project across all regions.
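A sketch of a cross-region listing, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials (both names are assumptions):

```clojure
;; List jobs across all regions of a project, one summary page at a time.
(projects/jobs-aggregated$
  auth
  {:projectId "my-project"       ; required
   :view "JOB_VIEW_SUMMARY"      ; optional: lighter payload than JOB_VIEW_ALL
   :pageSize 50})                ; optional: page through via :pageToken
```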
(jobs-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/create
Required parameters: projectId
Optional parameters: location, replaceJobId, view
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:datastoreDetails [DatastoreIODetails], :sdkVersion SdkVersion, :fileDetails [FileIODetails], :bigqueryDetails [BigQueryIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :spannerDetails [SpannerIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageState string, :executionStageName string, :currentStateTime string}], :name string, :steps [{:properties {}, :name string, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:displayData [DisplayData], :executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary]}, :replacedByJobId string, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Creates a Cloud Dataflow job. To create a job, we recommend using `projects.locations.jobs.create` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.create` is not recommended, as your job will always start in `us-central1`.
(jobs-debug-getConfig$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/getConfig
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:componentId string, :workerId string, :location string}
Gets the encoded debug configuration for a component. Not cacheable.
(jobs-debug-sendCapture$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/sendCapture
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:location string, :data string, :componentId string, :workerId string}
Sends encoded debug capture data for a component.
(jobs-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/get
Required parameters: projectId, jobId
Optional parameters: view, location
Gets the state of the specified Cloud Dataflow job. To get the state of a job, we recommend using `projects.locations.jobs.get` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.get` is not recommended, as you can only get the state of jobs that are running in `us-central1`.
(jobs-getMetrics$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/getMetrics
Required parameters: projectId, jobId
Optional parameters: location, startTime
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.getMetrics` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.getMetrics` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(jobs-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/list
Required parameters: projectId
Optional parameters: pageSize, view, filter, location, pageToken
List the jobs of a project. To list the jobs of a project in a region, we recommend using `projects.locations.jobs.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To list all jobs across all regions, use `projects.jobs.aggregated`. Using `projects.jobs.list` is not recommended, as you can only get the list of jobs that are running in `us-central1`.
(jobs-messages-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/messages/list
Required parameters: projectId, jobId
Optional parameters: endTime, location, pageToken, startTime, pageSize, minimumImportance
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.messages.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.messages.list` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(jobs-snapshot$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/snapshot
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:description string, :snapshotSources boolean, :ttl string, :location string}
Snapshot the state of a streaming job.
(jobs-update$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/update
Required parameters: projectId, jobId
Optional parameters: location
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:datastoreDetails [DatastoreIODetails], :sdkVersion SdkVersion, :fileDetails [FileIODetails], :bigqueryDetails [BigQueryIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :spannerDetails [SpannerIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageState string, :executionStageName string, :currentStateTime string}], :name string, :steps [{:properties {}, :name string, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:displayData [DisplayData], :executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary]}, :replacedByJobId string, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.update` is not recommended, as you can only update the state of jobs that are running in `us-central1`.
(jobs-workItems-lease$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/lease
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:currentWorkerTime string, :workItemTypes [string], :location string, :unifiedWorkerRequest {}, :workerCapabilities [string], :workerId string, :requestedLeaseDuration string}
Leases a dataflow WorkItem to run.
(jobs-workItems-reportStatus$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/reportStatus
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:workerId string, :currentWorkerTime string, :location string, :workItemStatuses [{:stopPosition Position, :sourceFork SourceFork, :sourceOperationResponse SourceOperationResponse, :errors [Status], :reportedProgress ApproximateReportedProgress, :completed boolean, :workItemId string, :reportIndex string, :totalThrottlerWaitTimeSeconds number, :metricUpdates [MetricUpdate], :progress ApproximateProgress, :dynamicSourceSplit DynamicSourceSplit, :counterUpdates [CounterUpdate], :requestedLeaseDuration string}], :unifiedWorkerRequest {}}
Reports the status of dataflow WorkItems leased by a worker.
(locations-flexTemplates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/flexTemplates/launch
Required parameters: projectId, location
Optional parameters: none
Body:
{:validateOnly boolean, :launchParameter {:containerSpecGcsPath string, :parameters {}, :jobName string, :containerSpec ContainerSpec}}
Launch a job with a FlexTemplate.
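A launch sketch, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials; the bucket paths, job name, and parameter keys below are hypothetical:

```clojure
;; Launch a job from a Flex Template spec stored in Cloud Storage.
;; Set :validateOnly true to dry-run the request without starting a job.
(projects/locations-flexTemplates-launch$
  auth
  {:projectId "my-project" :location "us-east1"}  ; both required
  {:validateOnly false
   :launchParameter
   {:jobName "nightly-import"
    :containerSpecGcsPath "gs://my-bucket/templates/spec.json"
    :parameters {:input "gs://my-bucket/in/*.csv"}}})
```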
(locations-jobs-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/create
Required parameters: location, projectId
Optional parameters: replaceJobId, view
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:datastoreDetails [DatastoreIODetails], :sdkVersion SdkVersion, :fileDetails [FileIODetails], :bigqueryDetails [BigQueryIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :spannerDetails [SpannerIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageState string, :executionStageName string, :currentStateTime string}], :name string, :steps [{:properties {}, :name string, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:displayData [DisplayData], :executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary]}, :replacedByJobId string, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Creates a Cloud Dataflow job. To create a job, we recommend using `projects.locations.jobs.create` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.create` is not recommended, as your job will always start in `us-central1`.
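A creation sketch using the regional variant recommended above, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials; only a few of the Body fields are populated, and all values are illustrative:

```clojure
;; Create a batch job via the regional endpoint (preferred over jobs-create$).
(projects/locations-jobs-create$
  auth
  {:projectId "my-project" :location "us-east1"}  ; both required
  {:name "word-count"
   :type "JOB_TYPE_BATCH"
   :environment {:tempStoragePrefix "gs://my-bucket/tmp"}})
```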
(locations-jobs-debug-getConfig$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/getConfig
Required parameters: location, projectId, jobId
Optional parameters: none
Body:
{:componentId string, :workerId string, :location string}
Gets the encoded debug configuration for a component. Not cacheable.
(locations-jobs-debug-sendCapture$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/sendCapture
Required parameters: projectId, jobId, location
Optional parameters: none
Body:
{:location string, :data string, :componentId string, :workerId string}
Sends encoded debug capture data for a component.
(locations-jobs-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/get
Required parameters: location, projectId, jobId
Optional parameters: view
Gets the state of the specified Cloud Dataflow job. To get the state of a job, we recommend using `projects.locations.jobs.get` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.get` is not recommended, as you can only get the state of jobs that are running in `us-central1`.
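A lookup sketch, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials; the job ID format shown is illustrative:

```clojure
;; Fetch the current state of a job, asking for the summary view only.
(projects/locations-jobs-get$
  auth
  {:projectId "my-project"                         ; required
   :location "us-east1"                            ; required
   :jobId "2021-01-01_00_00_00-1234567890"         ; required (hypothetical ID)
   :view "JOB_VIEW_SUMMARY"})                      ; optional
```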
(locations-jobs-getMetrics$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/getMetrics
Required parameters: projectId, jobId, location
Optional parameters: startTime
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.getMetrics` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.getMetrics` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(locations-jobs-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/list
Required parameters: projectId, location
Optional parameters: filter, pageToken, pageSize, view
List the jobs of a project. To list the jobs of a project in a region, we recommend using `projects.locations.jobs.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To list all jobs across all regions, use `projects.jobs.aggregated`. Using `projects.jobs.list` is not recommended, as you can only get the list of jobs that are running in `us-central1`.
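A pagination sketch, assuming the generated namespace is required as `projects`, `auth` holds valid OAuth2 credentials, and the call returns the parsed `ListJobsResponse` body (`:jobs` and `:nextPageToken` keys) — all of which are assumptions worth verifying against your client:

```clojure
;; Walk every page of a regional job listing by threading :pageToken.
(loop [token nil, acc []]
  (let [resp (projects/locations-jobs-list$
               auth
               (cond-> {:projectId "my-project"
                        :location "us-east1"
                        :pageSize 100}
                 token (assoc :pageToken token)))
        acc  (into acc (:jobs resp))]
    (if-let [next-token (:nextPageToken resp)]
      (recur next-token acc)
      acc)))
```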
(locations-jobs-messages-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/messages/list
Required parameters: projectId, jobId, location
Optional parameters: endTime, pageToken, startTime, pageSize, minimumImportance
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.messages.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.messages.list` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(locations-jobs-snapshot$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/snapshot
Required parameters: projectId, jobId, location
Optional parameters: none
Body:
{:description string, :snapshotSources boolean, :ttl string, :location string}
Snapshot the state of a streaming job.
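A snapshot sketch, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials; the `:ttl` value uses the protobuf Duration string form (seconds with an `s` suffix), and all values are illustrative:

```clojure
;; Snapshot a running streaming job's state before a risky deploy.
(projects/locations-jobs-snapshot$
  auth
  {:projectId "my-project" :location "us-east1" :jobId "job-id"}  ; all required
  {:description "pre-deploy checkpoint"
   :snapshotSources false          ; set true to also snapshot Pub/Sub sources
   :ttl "604800s"})                ; keep the snapshot for 7 days
```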
(locations-jobs-snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/snapshots/list
Required parameters: projectId, jobId, location
Optional parameters: none
Lists snapshots.
(locations-jobs-update$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/update
Required parameters: projectId, jobId, location
Optional parameters: none
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:datastoreDetails [DatastoreIODetails], :sdkVersion SdkVersion, :fileDetails [FileIODetails], :bigqueryDetails [BigQueryIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :spannerDetails [SpannerIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageState string, :executionStageName string, :currentStateTime string}], :name string, :steps [{:properties {}, :name string, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:displayData [DisplayData], :executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary]}, :replacedByJobId string, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.update` is not recommended, as you can only update the state of jobs that are running in `us-central1`.
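A common use of update is requesting a state transition rather than editing the job definition. A sketch, assuming the generated namespace is required as `projects` and `auth` holds valid OAuth2 credentials:

```clojure
;; Ask the service to cancel a running job by setting :requestedState.
(projects/locations-jobs-update$
  auth
  {:projectId "my-project" :location "us-east1" :jobId "job-id"}  ; all required
  {:requestedState "JOB_STATE_CANCELLED"})
```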
(locations-jobs-workItems-lease$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/lease
Required parameters: location, projectId, jobId
Optional parameters: none
Body:
{:currentWorkerTime string, :workItemTypes [string], :location string, :unifiedWorkerRequest {}, :workerCapabilities [string], :workerId string, :requestedLeaseDuration string}
Leases a dataflow WorkItem to run.
(locations-jobs-workItems-reportStatus$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/reportStatus
Required parameters: projectId, jobId, location
Optional parameters: none
Body:
{:workerId string, :currentWorkerTime string, :location string, :workItemStatuses [{:stopPosition Position, :sourceFork SourceFork, :sourceOperationResponse SourceOperationResponse, :errors [Status], :reportedProgress ApproximateReportedProgress, :completed boolean, :workItemId string, :reportIndex string, :totalThrottlerWaitTimeSeconds number, :metricUpdates [MetricUpdate], :progress ApproximateProgress, :dynamicSourceSplit DynamicSourceSplit, :counterUpdates [CounterUpdate], :requestedLeaseDuration string}], :unifiedWorkerRequest {}}
Reports the status of dataflow WorkItems leased by a worker.
(locations-snapshots-delete$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/delete
Required parameters: projectId, snapshotId, location
Optional parameters: none
Deletes a snapshot.
(locations-snapshots-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/get
Required parameters: projectId, snapshotId, location
Optional parameters: none
Gets information about a snapshot.
(locations-snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/list
Required parameters: location, projectId
Optional parameters: jobId
Lists snapshots.
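A usage sketch for the snapshot-listing call above. The namespace alias, the shape of the auth map, and every identifier below are illustrative assumptions, not values verified against this library:

```clojure
;; Hypothetical sketch -- require path, auth-map shape, and all IDs are
;; placeholders, not verified against this library's actual setup.
(require '[happygapi.dataflow.projects :as projects])

(def auth {:access-token "ya29.placeholder"}) ; placeholder OAuth2 credential

;; List snapshots in one region, optionally narrowed to a single job:
(projects/locations-snapshots-list$
 auth
 {:projectId "my-project"
  :location  "us-central1"
  :jobId     "2021-06-01_12_00_00-1234567890"}) ; jobId is optional
```

Omitting `:jobId` would list all snapshots in the given region for the project.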
(locations-sql-validate$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/sql/validate
Required parameters: location, projectId
Optional parameters: query
Validates a GoogleSQL query for Cloud Dataflow syntax. Will always confirm the given query parses correctly, and if able to look up schema information from DataCatalog, will validate that the query analyzes properly as well.
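Since `query` is listed as a request parameter rather than a body field, a validation call might look like the following sketch (namespace alias, auth map, and project ID are assumptions):

```clojure
;; Hypothetical sketch -- require path and credentials are placeholders.
(require '[happygapi.dataflow.projects :as projects])

(def auth {:access-token "ya29.placeholder"}) ; placeholder OAuth2 credential

;; Ask the service to parse (and, where schema info is available, analyze)
;; a GoogleSQL query without running it:
(projects/locations-sql-validate$
 auth
 {:projectId "my-project"
  :location  "us-central1"
  :query     "SELECT 1"})
```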
(locations-templates-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/create
Required parameters: location, projectId
Optional parameters: none
Body:
{:environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :location string, :parameters {}, :jobName string, :gcsPath string}
Creates a Cloud Dataflow job from a template.
(locations-templates-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/get
Required parameters: projectId, location
Optional parameters: view, gcsPath
Get the template associated with a template.
(locations-templates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/launch
Required parameters: projectId, location
Optional parameters: validateOnly, gcsPath, dynamicTemplate.gcsPath, dynamicTemplate.stagingLocation
Body:
{:update boolean, :parameters {}, :jobName string, :transformNameMapping {}, :environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}}
Launch a template.
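The launch call above takes query parameters (including `gcsPath`) separately from the body. A sketch launching a Google-provided template, where the namespace alias, auth map, bucket names, and template parameters are all assumptions:

```clojure
;; Hypothetical sketch -- require path, credentials, and bucket paths are
;; placeholders; the Word_Count template parameters are assumed, not verified.
(require '[happygapi.dataflow.projects :as projects])

(def auth {:access-token "ya29.placeholder"}) ; placeholder OAuth2 credential

(projects/locations-templates-launch$
 auth
 {:projectId "my-project"
  :location  "us-central1"
  :gcsPath   "gs://dataflow-templates/latest/Word_Count"}
 {:jobName "wordcount-example"
  :parameters {:inputFile "gs://dataflow-samples/shakespeare/kinglear.txt"
               :output    "gs://my-bucket/wordcount/out"}
  :environment {:maxWorkers   2
                :tempLocation "gs://my-bucket/temp"}})
```

Passing `:validateOnly` among the parameters would check the request without actually starting a job.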
(locations-workerMessages$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/workerMessages
Required parameters: location, projectId
Optional parameters: none
Body:
{:workerMessages [{:labels {}, :time string, :workerLifecycleEvent WorkerLifecycleEvent, :workerShutdownNotice WorkerShutdownNotice, :workerHealthReport WorkerHealthReport, :workerMetrics ResourceUtilizationReport, :workerMessageCode WorkerMessageCode}], :location string}
Send a worker_message to the service.
(snapshots-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/snapshots/get
Required parameters: projectId, snapshotId
Optional parameters: location
Gets information about a snapshot.
(snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/snapshots/list
Required parameters: projectId
Optional parameters: jobId, location
Lists snapshots.
(templates-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/create
Required parameters: projectId
Optional parameters: none
Body:
{:environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :location string, :parameters {}, :jobName string, :gcsPath string}
Creates a Cloud Dataflow job from a template.
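For the non-regional variant above, the region can only be supplied via `:location` in the body. A sketch creating a job from a template staged in Cloud Storage (namespace alias, credentials, and all paths are assumptions):

```clojure
;; Hypothetical sketch -- require path, credentials, and GCS paths are
;; placeholders, not verified values.
(require '[happygapi.dataflow.projects :as projects])

(def auth {:access-token "ya29.placeholder"}) ; placeholder OAuth2 credential

(projects/templates-create$
 auth
 {:projectId "my-project"}
 {:jobName  "nightly-batch"
  :location "us-central1"            ; region goes in the body here
  :gcsPath  "gs://my-bucket/templates/my-template"
  :parameters  {:input "gs://my-bucket/input/part-00000"}
  :environment {:tempLocation "gs://my-bucket/temp"
                :maxWorkers   3}})
```

For other regions, the regional `locations-templates-create$` variant is the recommended form.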
(templates-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/get
Required parameters: projectId
Optional parameters: view, gcsPath, location
Get the template associated with a template.
(templates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/launch
Required parameters: projectId
Optional parameters: validateOnly, gcsPath, location, dynamicTemplate.gcsPath, dynamicTemplate.stagingLocation
Body:
{:update boolean, :parameters {}, :jobName string, :transformNameMapping {}, :environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}}
Launch a template.
(workerMessages$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/workerMessages
Required parameters: projectId
Optional parameters: none
Body:
{:workerMessages [{:labels {}, :time string, :workerLifecycleEvent WorkerLifecycleEvent, :workerShutdownNotice WorkerShutdownNotice, :workerHealthReport WorkerHealthReport, :workerMetrics ResourceUtilizationReport, :workerMessageCode WorkerMessageCode}], :location string}
Send a worker_message to the service.