For a batch pipeline, you can pass scheduler information. Data Pipelines uses this information to create an internal scheduler that runs jobs periodically.
If the internal scheduler is not configured, you can start jobs on demand by calling pipelines.run.
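As a rough sketch, the pipelines.run request for a one-off job is a POST to the pipeline's resource name with a `:run` suffix. The project, location, and pipeline IDs below are hypothetical, and an actual call also needs an OAuth access token:

```python
# Hypothetical identifiers, for illustration only.
project = "my-project"
location = "us-central1"
pipeline_id = "my-batch-pipeline"

pipeline_name = f"projects/{project}/locations/{location}/pipelines/{pipeline_id}"

# pipelines.run is invoked by POSTing to the resource name plus ":run".
run_url = f"https://datapipelines.googleapis.com/v1/{pipeline_name}:run"
print(run_url)
```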
HTTP request
POST https://datapipelines.googleapis.com/v1/{parent=projects/*/locations/*}/pipelines
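As a minimal sketch, a create request could be assembled like this. The project, location, pipeline name, and cron schedule are hypothetical; the field names follow the Data Pipelines v1 Pipeline resource, and sending the request also requires an OAuth token with the cloud-platform scope:

```python
import json

# Hypothetical values, for illustration only.
project = "my-project"
location = "us-central1"
parent = f"projects/{project}/locations/{location}"

# The create endpoint: POST to the parent location's pipelines collection.
url = f"https://datapipelines.googleapis.com/v1/{parent}/pipelines"

# A batch Pipeline with scheduler information, so the internal
# scheduler runs the job periodically.
pipeline = {
    "name": f"{parent}/pipelines/my-batch-pipeline",
    "type": "PIPELINE_TYPE_BATCH",
    "scheduleInfo": {
        "schedule": "0 * * * *",  # example cron schedule: run hourly
        "timeZone": "UTC",
    },
}

body = json.dumps(pipeline)  # request body sent with the POST
print(url)
```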
The URL uses gRPC Transcoding syntax.

Path parameters
parent: Required. The name of the location to create the pipeline in.

Request body
The request body contains an instance of Pipeline.

Response body
If successful, the response body contains a newly created instance of Pipeline.

Authorization scopes
Requires the following OAuth scope:
https://www.googleapis.com/auth/cloud-platform

IAM permissions
Requires the following IAM permission on the parent resource:
datapipelines.pipelines.create

Last updated 2023-01-03 UTC.