Data pipelines API: projects. Data Pipelines provides an interface for creating, updating, and managing recurring Data Analytics jobs. See: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST reference: v1/projects)
(locations-pipelines-create$ auth parameters body)
Required parameters: parent
Optional parameters: none
Body:
{:schedulerServiceAccountEmail string, :scheduleInfo {:schedule string, :timeZone string, :nextJobTime string}, :jobCount integer, :displayName string, :name string, :createTime string, :type string, :state string, :pipelineSources {}, :lastUpdateTime string, :workload {:dataflowFlexTemplateRequest GoogleCloudDatapipelinesV1LaunchFlexTemplateRequest, :dataflowLaunchTemplateRequest GoogleCloudDatapipelinesV1LaunchTemplateRequest}}
Creates a pipeline. For a batch pipeline, you can pass scheduler information. Data Pipelines uses the scheduler information to create an internal scheduler that runs jobs periodically. If the internal scheduler is not configured, you can use RunPipeline to run jobs.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/create)
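A minimal usage sketch for creating a batch pipeline. The namespace `happygapi.datapipelines.projects` and the `auth` credential map are assumptions (how to obtain OAuth2 credentials is outside this doc); the body keys come from the schema above, but the `:type` enum value and the project/location names are illustrative.

```clojure
(require '[happygapi.datapipelines.projects :as projects])

;; `auth` is an OAuth2 credential map obtained elsewhere (assumption).
;; Scheduler info makes Data Pipelines create an internal scheduler; omit
;; :scheduleInfo and :schedulerServiceAccountEmail to trigger runs manually
;; via locations-pipelines-run$ instead.
(projects/locations-pipelines-create$
  auth
  {:parent "projects/my-project/locations/us-central1"}
  {:name "projects/my-project/locations/us-central1/pipelines/my-batch-pipeline"
   :displayName "my-batch-pipeline"
   :type "PIPELINE_TYPE_BATCH"          ; assumed enum value for a batch pipeline
   :scheduleInfo {:schedule "0 * * * *" ; unix-cron: hourly
                  :timeZone "UTC"}})
```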
(locations-pipelines-delete$ auth parameters)
Required parameters: name
Optional parameters: none
Deletes a pipeline. If a scheduler job is attached to the pipeline, it will be deleted.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/delete)
(locations-pipelines-get$ auth parameters)
Required parameters: name
Optional parameters: none
Looks up a single pipeline. Returns a "NOT_FOUND" error if no such pipeline exists. Returns a "FORBIDDEN" error if the caller doesn't have permission to access it.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/get)
(locations-pipelines-jobs-list$ auth parameters)
Required parameters: parent
Optional parameters: pageToken, pageSize
Lists jobs for a given pipeline. Throws a "FORBIDDEN" error if the caller doesn't have permission to access it.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/jobs/list)
(locations-pipelines-list$ auth parameters)
Required parameters: parent
Optional parameters: pageSize, filter, pageToken
Lists pipelines. Returns a "FORBIDDEN" error if the caller doesn't have permission to access it.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/list)
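A sketch of listing pipelines with paging, assuming the same hypothetical namespace and `auth` map as above. `pageToken` would be taken from the previous response to fetch the next page:

```clojure
(require '[happygapi.datapipelines.projects :as projects])

;; First page: pass the parent and a page size.
;; Subsequent pages: add :pageToken from the prior response (assumption
;; based on standard Google API paging conventions).
(projects/locations-pipelines-list$
  auth
  {:parent "projects/my-project/locations/us-central1"
   :pageSize 50})
```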
(locations-pipelines-patch$ auth parameters body)
Required parameters: name
Optional parameters: updateMask
Body:
{:schedulerServiceAccountEmail string, :scheduleInfo {:schedule string, :timeZone string, :nextJobTime string}, :jobCount integer, :displayName string, :name string, :createTime string, :type string, :state string, :pipelineSources {}, :lastUpdateTime string, :workload {:dataflowFlexTemplateRequest GoogleCloudDatapipelinesV1LaunchFlexTemplateRequest, :dataflowLaunchTemplateRequest GoogleCloudDatapipelinesV1LaunchTemplateRequest}}
Updates a pipeline. If successful, the updated Pipeline is returned. Returns a "NOT_FOUND" error if the pipeline doesn't exist. If UpdatePipeline does not return successfully, you can retry the UpdatePipeline request until you receive a successful response.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/patch)
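A sketch of a partial update, assuming the same hypothetical namespace and `auth` map. `updateMask` names the fields to change (standard FieldMask semantics), so the body only needs those fields:

```clojure
(require '[happygapi.datapipelines.projects :as projects])

;; Update only the display name; other pipeline fields are left untouched
;; because updateMask restricts the patch to the listed field.
(projects/locations-pipelines-patch$
  auth
  {:name "projects/my-project/locations/us-central1/pipelines/my-batch-pipeline"
   :updateMask "displayName"}
  {:displayName "renamed-pipeline"})
```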
(locations-pipelines-run$ auth parameters body)
Required parameters: name
Optional parameters: none
Body:
{}
Creates a job for the specified pipeline directly. You can use this method when the internal scheduler is not configured and you want to trigger the job directly or through an external system. Returns a "NOT_FOUND" error if the pipeline doesn't exist. Returns a "FORBIDDEN" error if the user doesn't have permission to access the pipeline or run jobs for the pipeline.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/run)
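A sketch of triggering a run directly (e.g. from an external scheduler) when no internal scheduler is configured, assuming the same hypothetical namespace and `auth` map. The body is the empty map, as documented above:

```clojure
(require '[happygapi.datapipelines.projects :as projects])

;; Directly create one job for the named pipeline. The empty body map
;; matches the documented request body for this method.
(projects/locations-pipelines-run$
  auth
  {:name "projects/my-project/locations/us-central1/pipelines/my-batch-pipeline"}
  {})
```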
(locations-pipelines-stop$ auth parameters body)
Required parameters: name
Optional parameters: none
Body:
{}
Freezes pipeline execution permanently. If there's a corresponding scheduler entry, it's deleted, and the pipeline state is changed to "ARCHIVED". However, pipeline metadata is retained.
Reference: https://cloud.google.com/dataflow/docs/guides/data-pipelines (REST method: v1/projects/locations/pipelines/stop)