Dataflow API: projects. Manages Google Cloud Dataflow projects on Google Cloud Platform. See: https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects
(deleteSnapshots$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/deleteSnapshots
Required parameters: projectId
Optional parameters: snapshotId, location
Deletes a snapshot.
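For illustration, a minimal call sketch follows. It assumes `auth` is whatever OAuth 2 credential this client accepts, that the parameters are passed as a keyword-keyed map matching the parameter names above, and that the project, location, and snapshot IDs are placeholders:

```clojure
;; Delete a specific snapshot in a specific region.
;; "my-project" and "snapshot-123" are hypothetical values.
(deleteSnapshots$ auth
                  {:projectId  "my-project"     ; required
                   :snapshotId "snapshot-123"   ; optional
                   :location   "us-central1"})  ; optional
```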
(jobs-aggregated$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/aggregated
Required parameters: projectId
Optional parameters: location, filter, pageSize, pageToken, view
List the jobs of a project across all regions.
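Under the same assumptions as the sketch above (placeholder project ID, keyword-keyed parameter map), listing active jobs across every region might look like:

```clojure
;; List jobs across all regions, filtered to active jobs and paged.
;; "ACTIVE" is one of the API's documented job-state filters.
(jobs-aggregated$ auth
                  {:projectId "my-project"  ; required
                   :filter    "ACTIVE"      ; optional state filter
                   :pageSize  50})          ; optional page size
```

Subsequent pages can be fetched by passing the response's next-page token back in as `:pageToken`.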
(jobs-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/create
Required parameters: projectId
Optional parameters: replaceJobId, view, location
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:fileDetails [FileIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :sdkVersion SdkVersion, :datastoreDetails [DatastoreIODetails], :spannerDetails [SpannerIODetails], :bigqueryDetails [BigQueryIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageName string, :currentStateTime string, :executionStageState string}], :name string, :steps [{:name string, :properties {}, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary], :displayData [DisplayData]}, :replacedByJobId string, :satisfiesPzs boolean, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :shuffleMode string, :debugOptions DebugOptions, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :serviceOptions [string], :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Creates a Cloud Dataflow job. To create a job, we recommend using `projects.locations.jobs.create` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.create` is not recommended, as your job will always start in `us-central1`.
(jobs-debug-getConfig$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/getConfig
Required parameters: jobId, projectId
Optional parameters: none
Body:
{:workerId string, :componentId string, :location string}
Get encoded debug configuration for component. Not cacheable.
(jobs-debug-sendCapture$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/debug/sendCapture
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:location string, :workerId string, :data string, :componentId string}
Send encoded debug capture data for component.
(jobs-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/get
Required parameters: projectId, jobId
Optional parameters: location, view
Gets the state of the specified Cloud Dataflow job. To get the state of a job, we recommend using `projects.locations.jobs.get` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.get` is not recommended, as you can only get the state of jobs that are running in `us-central1`.
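A hedged sketch of fetching one job, again with placeholder IDs; `JOB_VIEW_SUMMARY` is one of the API's documented job views:

```clojure
;; Fetch one job's state; pass :location for jobs outside us-central1.
(jobs-get$ auth
           {:projectId "my-project"            ; required
            :jobId     "2024-01-01_00_00_00-1" ; required; placeholder
            :location  "us-central1"           ; optional
            :view      "JOB_VIEW_SUMMARY"})    ; optional
```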
(jobs-getMetrics$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/getMetrics
Required parameters: jobId, projectId
Optional parameters: location, startTime
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.getMetrics` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.getMetrics` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(jobs-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/list
Required parameters: projectId
Optional parameters: filter, pageToken, pageSize, location, view
List the jobs of a project. To list the jobs of a project in a region, we recommend using `projects.locations.jobs.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To list all jobs across all regions, use `projects.jobs.aggregated`. Using `projects.jobs.list` is not recommended, as you can only get the list of jobs that are running in `us-central1`.
(jobs-messages-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/messages/list
Required parameters: projectId, jobId
Optional parameters: location, pageToken, minimumImportance, pageSize, endTime, startTime
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.messages.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.messages.list` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
(jobs-snapshot$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/snapshot
Required parameters: jobId, projectId
Optional parameters: none
Body:
{:snapshotSources boolean, :ttl string, :description string, :location string}
Snapshot the state of a streaming job.
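A sketch of snapshotting a streaming job; the TTL is an example duration string and the job ID is a placeholder:

```clojure
;; Snapshot a running streaming job; body fields mirror the schema above.
(jobs-snapshot$ auth
                {:projectId "my-project"        ; required
                 :jobId     "streaming-job-id"} ; required; placeholder
                {:ttl             "604800s"     ; keep the snapshot for 7 days
                 :snapshotSources false
                 :description     "pre-update snapshot"
                 :location        "us-central1"})
```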
(jobs-update$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/update
Required parameters: projectId, jobId
Optional parameters: location
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:fileDetails [FileIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :sdkVersion SdkVersion, :datastoreDetails [DatastoreIODetails], :spannerDetails [SpannerIODetails], :bigqueryDetails [BigQueryIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageName string, :currentStateTime string, :executionStageState string}], :name string, :steps [{:name string, :properties {}, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary], :displayData [DisplayData]}, :replacedByJobId string, :satisfiesPzs boolean, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :shuffleMode string, :debugOptions DebugOptions, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :serviceOptions [string], :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.update` is not recommended, as you can only update the state of jobs that are running in `us-central1`.
(jobs-workItems-lease$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/lease
Required parameters: projectId, jobId
Optional parameters: none
Body:
{:location string, :unifiedWorkerRequest {}, :requestedLeaseDuration string, :workerCapabilities [string], :workItemTypes [string], :workerId string, :currentWorkerTime string}
Leases a dataflow WorkItem to run.
(jobs-workItems-reportStatus$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/jobs/workItems/reportStatus
Required parameters: jobId, projectId
Optional parameters: none
Body:
{:currentWorkerTime string, :workItemStatuses [{:stopPosition Position, :sourceFork SourceFork, :sourceOperationResponse SourceOperationResponse, :errors [Status], :reportedProgress ApproximateReportedProgress, :completed boolean, :workItemId string, :reportIndex string, :totalThrottlerWaitTimeSeconds number, :metricUpdates [MetricUpdate], :progress ApproximateProgress, :dynamicSourceSplit DynamicSourceSplit, :counterUpdates [CounterUpdate], :requestedLeaseDuration string}], :location string, :unifiedWorkerRequest {}, :workerId string}
Reports the status of dataflow WorkItems leased by a worker.
(locations-flexTemplates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/flexTemplates/launch
Required parameters: location, projectId
Optional parameters: none
Body:
{:validateOnly boolean, :launchParameter {:containerSpec ContainerSpec, :containerSpecGcsPath string, :update boolean, :parameters {}, :jobName string, :launchOptions {}, :environment FlexTemplateRuntimeEnvironment, :transformNameMappings {}}}
Launch a job with a FlexTemplate.
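A sketch of a Flex Template launch; the GCS paths and the template's parameter names are placeholders for whatever your template actually defines:

```clojure
;; Launch a Flex Template job from a template spec file in Cloud Storage.
(locations-flexTemplates-launch$
  auth
  {:projectId "my-project" :location "us-central1"}   ; both required
  {:launchParameter {:jobName              "wordcount-1"
                     :containerSpecGcsPath "gs://my-bucket/templates/wordcount.json"
                     :parameters           {:inputFile "gs://my-bucket/input.txt"
                                            :output    "gs://my-bucket/out"}}
   :validateOnly    false})
```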
(locations-jobs-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/create
Required parameters: location, projectId
Optional parameters: view, replaceJobId
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:fileDetails [FileIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :sdkVersion SdkVersion, :datastoreDetails [DatastoreIODetails], :spannerDetails [SpannerIODetails], :bigqueryDetails [BigQueryIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageName string, :currentStateTime string, :executionStageState string}], :name string, :steps [{:name string, :properties {}, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary], :displayData [DisplayData]}, :replacedByJobId string, :satisfiesPzs boolean, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :shuffleMode string, :debugOptions DebugOptions, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :serviceOptions [string], :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Creates a Cloud Dataflow job. To create a job, we recommend using `projects.locations.jobs.create` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.create` is not recommended, as your job will always start in `us-central1`.
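A sketch of the call shape only; in practice the Apache Beam SDK assembles the job body (steps, environment, and so on), so the body below is deliberately skeletal and its values are placeholders:

```clojure
;; Shape of a direct job-creation call. Real jobs carry SDK-generated steps.
(locations-jobs-create$
  auth
  {:projectId "my-project" :location "us-central1"}   ; required
  {:name        "my-batch-job"
   :type        "JOB_TYPE_BATCH"                      ; documented job type
   :environment {:tempStoragePrefix   "gs://my-bucket/tmp"
                 :serviceAccountEmail "worker-sa@my-project.iam.gserviceaccount.com"}
   :steps       []})                                  ; placeholder; normally SDK-built
```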
(locations-jobs-debug-getConfig$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/getConfig
Required parameters: location, projectId, jobId
Optional parameters: none
Body:
{:workerId string, :componentId string, :location string}
Get encoded debug configuration for component. Not cacheable.
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/getConfig Required parameters: location, projectId, jobId Optional parameters: none Body: {:workerId string, :componentId string, :location string} Get encoded debug configuration for component. Not cacheable.
(locations-jobs-debug-sendCapture$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/sendCapture
Required parameters: jobId, location, projectId
Optional parameters: none
Body:
{:location string, :workerId string, :data string, :componentId string}
Send encoded debug capture data for component.
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/debug/sendCapture Required parameters: jobId, location, projectId Optional parameters: none Body: {:location string, :workerId string, :data string, :componentId string} Send encoded debug capture data for component.
(locations-jobs-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/get
Required parameters: projectId, jobId, location
Optional parameters: view
Gets the state of the specified Cloud Dataflow job. To get the state of a job, we recommend using `projects.locations.jobs.get` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.get` is not recommended, as you can only get the state of jobs that are running in `us-central1`.
(locations-jobs-getExecutionDetails$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/getExecutionDetails
Required parameters: jobId, location, projectId
Optional parameters: pageSize, pageToken
Request detailed information about the execution status of the job. EXPERIMENTAL. This API is subject to change or removal without notice.
(locations-jobs-getMetrics$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/getMetrics
Required parameters: projectId, jobId, location
Optional parameters: startTime
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.getMetrics` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.getMetrics` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
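A sketch of requesting metrics for one job, restricted to metrics updated after a placeholder `startTime`:

```clojure
;; Fetch metrics for a job, optionally only those changed since startTime.
(locations-jobs-getMetrics$
  auth
  {:projectId "my-project"             ; required
   :jobId     "2024-01-01_00_00_00-1"  ; required; placeholder
   :location  "us-central1"            ; required
   :startTime "2024-01-01T00:00:00Z"}) ; optional RFC 3339 timestamp
```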
(locations-jobs-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/list
Required parameters: location, projectId
Optional parameters: pageSize, filter, view, pageToken
List the jobs of a project. To list the jobs of a project in a region, we recommend using `projects.locations.jobs.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To list all jobs across all regions, use `projects.jobs.aggregated`. Using `projects.jobs.list` is not recommended, as you can only get the list of jobs that are running in `us-central1`.
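A paging sketch for one region, assuming each call returns the parsed `ListJobsResponse` as a keyword-keyed map (so `:jobs` and `:nextPageToken` are available on the result):

```clojure
;; Collect every job in one region by following nextPageToken.
(loop [page-token nil
       acc        []]
  (let [resp (locations-jobs-list$ auth
                                   (cond-> {:projectId "my-project"
                                            :location  "us-central1"
                                            :pageSize  100}
                                     page-token (assoc :pageToken page-token)))
        jobs (into acc (:jobs resp))]
    (if-let [token (:nextPageToken resp)]
      (recur token jobs)
      jobs)))
```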
(locations-jobs-messages-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/messages/list
Required parameters: location, projectId, jobId
Optional parameters: endTime, minimumImportance, pageSize, startTime, pageToken
Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.messages.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.messages.list` is not recommended, as you can only request the status of jobs that are running in `us-central1`.
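A sketch that pulls only warning-and-above messages for a placeholder job; `JOB_MESSAGE_WARNING` is one of the API's documented importance levels:

```clojure
;; List a job's messages, filtered to warnings and errors.
(locations-jobs-messages-list$
  auth
  {:projectId         "my-project"            ; required
   :location          "us-central1"           ; required
   :jobId             "2024-01-01_00_00_00-1" ; required; placeholder
   :minimumImportance "JOB_MESSAGE_WARNING"   ; optional importance floor
   :pageSize          100})
```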
(locations-jobs-snapshot$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/snapshot
Required parameters: jobId, location, projectId
Optional parameters: none
Body:
{:snapshotSources boolean, :ttl string, :description string, :location string}
Snapshot the state of a streaming job.
(locations-jobs-snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/snapshots/list
Required parameters: projectId, jobId, location
Optional parameters: none
Lists snapshots.
(locations-jobs-stages-getExecutionDetails$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/stages/getExecutionDetails
Required parameters: stageId, jobId, projectId, location
Optional parameters: endTime, startTime, pageToken, pageSize
Request detailed information about the execution status of a stage of the job. EXPERIMENTAL. This API is subject to change or removal without notice.
(locations-jobs-update$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/update
Required parameters: jobId, location, projectId
Optional parameters: none
Body:
{:labels {}, :stepsLocation string, :executionInfo {:stages {}}, :jobMetadata {:fileDetails [FileIODetails], :pubsubDetails [PubSubIODetails], :bigTableDetails [BigTableIODetails], :sdkVersion SdkVersion, :datastoreDetails [DatastoreIODetails], :spannerDetails [SpannerIODetails], :bigqueryDetails [BigQueryIODetails]}, :clientRequestId string, :startTime string, :stageStates [{:executionStageName string, :currentStateTime string, :executionStageState string}], :name string, :steps [{:name string, :properties {}, :kind string}], :createTime string, :currentStateTime string, :type string, :transformNameMapping {}, :replaceJobId string, :pipelineDescription {:executionPipelineStage [ExecutionStageSummary], :originalPipelineTransform [TransformSummary], :displayData [DisplayData]}, :replacedByJobId string, :satisfiesPzs boolean, :currentState string, :tempFiles [string], :id string, :createdFromSnapshotId string, :environment {:flexResourceSchedulingGoal string, :internalExperiments {}, :experiments [string], :workerRegion string, :shuffleMode string, :debugOptions DebugOptions, :sdkPipelineOptions {}, :serviceKmsKeyName string, :tempStoragePrefix string, :serviceAccountEmail string, :clusterManagerApiService string, :userAgent {}, :serviceOptions [string], :workerZone string, :version {}, :workerPools [WorkerPool], :dataset string}, :projectId string, :requestedState string, :location string}
Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.update` is not recommended, as you can only update the state of jobs that are running in `us-central1`.
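A common use of this endpoint is requesting cancellation by setting the job's requested state; a sketch with placeholder IDs:

```clojure
;; Ask the service to cancel a running job.
(locations-jobs-update$
  auth
  {:projectId "my-project"
   :jobId     "2024-01-01_00_00_00-1"   ; placeholder
   :location  "us-central1"}            ; all three required
  {:requestedState "JOB_STATE_CANCELLED"})
```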
(locations-jobs-workItems-lease$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/lease
Required parameters: jobId, location, projectId
Optional parameters: none
Body:
{:location string, :unifiedWorkerRequest {}, :requestedLeaseDuration string, :workerCapabilities [string], :workItemTypes [string], :workerId string, :currentWorkerTime string}
Leases a dataflow WorkItem to run.
(locations-jobs-workItems-reportStatus$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/jobs/workItems/reportStatus
Required parameters: jobId, location, projectId
Optional parameters: none
Body:
{:currentWorkerTime string, :workItemStatuses [{:stopPosition Position, :sourceFork SourceFork, :sourceOperationResponse SourceOperationResponse, :errors [Status], :reportedProgress ApproximateReportedProgress, :completed boolean, :workItemId string, :reportIndex string, :totalThrottlerWaitTimeSeconds number, :metricUpdates [MetricUpdate], :progress ApproximateProgress, :dynamicSourceSplit DynamicSourceSplit, :counterUpdates [CounterUpdate], :requestedLeaseDuration string}], :location string, :unifiedWorkerRequest {}, :workerId string}
Reports the status of dataflow WorkItems leased by a worker.
(locations-snapshots-delete$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/delete
Required parameters: projectId, location, snapshotId
Optional parameters: none
Deletes a snapshot.
(locations-snapshots-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/get
Required parameters: location, projectId, snapshotId
Optional parameters: none
Gets information about a snapshot.
(locations-snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/snapshots/list
Required parameters: location, projectId
Optional parameters: jobId
Lists snapshots.
(locations-sql-validate$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/sql/validate
Required parameters: projectId, location
Optional parameters: query
Validates a GoogleSQL query for Cloud Dataflow syntax. It always confirms that the given query parses correctly and, when schema information can be looked up from Data Catalog, also validates that the query analyzes properly.
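A sketch of validating a placeholder query, with the usual assumptions about `auth` and keyword-keyed parameters:

```clojure
;; Check that a Dataflow SQL query parses (and, where possible, analyzes).
(locations-sql-validate$ auth
                         {:projectId "my-project"   ; required
                          :location  "us-central1"  ; required
                          :query     "SELECT word, COUNT(*) AS n FROM my_table GROUP BY word"})
```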
(locations-templates-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/create
Required parameters: location, projectId
Optional parameters: none
Body:
{:environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :enableStreamingEngine boolean, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :location string, :jobName string, :gcsPath string, :parameters {}}
Creates a Cloud Dataflow job from a template.
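A sketch of running a job from a classic template; the template path, bucket names, and parameter names are placeholders (the parameter names must match whatever the template itself declares):

```clojure
;; Create a job from a classic template stored in Cloud Storage.
(locations-templates-create$
  auth
  {:projectId "my-project" :location "us-central1"}  ; required
  {:jobName     "job-from-template-1"
   :gcsPath     "gs://my-bucket/templates/my-template"
   :parameters  {:input  "gs://my-bucket/input.txt"
                 :output "gs://my-bucket/out"}
   :environment {:tempLocation "gs://my-bucket/tmp"
                 :maxWorkers   3}})
```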
(locations-templates-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/get
Required parameters: projectId, location
Optional parameters: view, gcsPath
Gets the template associated with a template file in Cloud Storage (see the `gcsPath` parameter).
(locations-templates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/templates/launch
Required parameters: projectId, location
Optional parameters: gcsPath, validateOnly, dynamicTemplate.gcsPath, dynamicTemplate.stagingLocation
Body:
{:parameters {}, :transformNameMapping {}, :environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :enableStreamingEngine boolean, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :update boolean, :jobName string}
Launch a template.
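A sketch of the launch variant, where `gcsPath` travels in the query parameters rather than the body; all values are placeholders:

```clojure
;; Launch a classic template via the launch endpoint.
(locations-templates-launch$
  auth
  {:projectId    "my-project"
   :location     "us-central1"
   :gcsPath      "gs://my-bucket/templates/my-template"  ; optional; placeholder
   :validateOnly false}
  {:jobName     "templated-job-1"
   :parameters  {:input "gs://my-bucket/in" :output "gs://my-bucket/out"}
   :environment {:tempLocation "gs://my-bucket/tmp"}})
```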
(locations-workerMessages$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/locations/workerMessages
Required parameters: projectId, location
Optional parameters: none
Body:
{:location string, :workerMessages [{:workerMessageCode WorkerMessageCode, :labels {}, :workerLifecycleEvent WorkerLifecycleEvent, :workerHealthReport WorkerHealthReport, :time string, :workerMetrics ResourceUtilizationReport, :workerShutdownNotice WorkerShutdownNotice}]}
Send a worker_message to the service.
(snapshots-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/snapshots/get
Required parameters: snapshotId, projectId
Optional parameters: location
Gets information about a snapshot.
(snapshots-list$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/snapshots/list
Required parameters: projectId
Optional parameters: location, jobId
Lists snapshots.
(templates-create$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/create
Required parameters: projectId
Optional parameters: none
Body:
{:environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :enableStreamingEngine boolean, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :location string, :jobName string, :gcsPath string, :parameters {}}
Creates a Cloud Dataflow job from a template.
(templates-get$ auth parameters)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/get
Required parameters: projectId
Optional parameters: location, gcsPath, view
Gets the template associated with a template file in Cloud Storage (see the `gcsPath` parameter).
(templates-launch$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/templates/launch
Required parameters: projectId
Optional parameters: dynamicTemplate.stagingLocation, dynamicTemplate.gcsPath, validateOnly, location, gcsPath
Body:
{:parameters {}, :transformNameMapping {}, :environment {:maxWorkers integer, :workerRegion string, :additionalExperiments [string], :zone string, :machineType string, :tempLocation string, :numWorkers integer, :serviceAccountEmail string, :bypassTempDirValidation boolean, :ipConfiguration string, :kmsKeyName string, :enableStreamingEngine boolean, :network string, :workerZone string, :additionalUserLabels {}, :subnetwork string}, :update boolean, :jobName string}
Launch a template.
(workerMessages$ auth parameters body)
https://cloud.google.com/dataflowapi/reference/rest/v1b3/projects/workerMessages
Required parameters: projectId
Optional parameters: none
Body:
{:location string, :workerMessages [{:workerMessageCode WorkerMessageCode, :labels {}, :workerLifecycleEvent WorkerLifecycleEvent, :workerHealthReport WorkerHealthReport, :time string, :workerMetrics ResourceUtilizationReport, :workerShutdownNotice WorkerShutdownNotice}]}
Send a worker_message to the service.