Dataflow estimated cost

The Estimated Cost page in the Google Cloud console shows the estimated cost of your current Dataflow job. Estimated costs are calculated by multiplying the resource usage metrics, as reported in Cloud Monitoring, by the price of those resources in the job's region.
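As a rough illustration of that calculation, the following sketch multiplies hypothetical usage figures by hypothetical per-unit prices. The resource names and prices are placeholders, not published Dataflow rates; the actual prices depend on the job's region.

```python
# Minimal sketch of the estimated-cost calculation: usage * regional unit price.
# All figures below are hypothetical placeholders, not published Dataflow prices.

# Resource usage for a job, as it might be read from Cloud Monitoring metrics.
usage = {
    "vcpu_hours": 12.5,           # total vCPU time consumed by workers
    "memory_gb_hours": 50.0,      # total memory consumed over time
    "shuffle_gb_processed": 8.0,  # Dataflow Shuffle data processed (batch)
    "ssd_gb_hours": 30.0,         # persistent disk (SSD) usage over time
}

# Hypothetical per-unit prices for the job's region (check the Dataflow
# pricing page for real values).
unit_price = {
    "vcpu_hours": 0.056,
    "memory_gb_hours": 0.0035,
    "shuffle_gb_processed": 0.011,
    "ssd_gb_hours": 0.000298,
}

estimated_cost = sum(usage[k] * unit_price[k] for k in usage)
print(f"Estimated job cost: ${estimated_cost:.2f}")
```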
To view the estimated cost for a job, follow these steps:

1. In the Google Cloud console, go to the Dataflow > Jobs page.
2. Select a job.
3. Click the Cost tab.

The estimate might not reflect the actual job cost, for example because of contractual discounts or temporary billing adjustments. To see the actual cost of your Dataflow jobs, view the Cloud Billing reports for your Cloud Billing account.
Job cost estimates are available for both batch and streaming jobs. The Estimated Cost page in the Google Cloud console provides the following information:
- Details about which resources contribute to the job cost and by how much. Resources include vCPUs, memory, Dataflow Shuffle data processed or Streaming Engine data processed, and SSD and HDD disk usage.
- Costs over specific time windows, such as the time since the job started, the previous hour, the last 24 hours, the preceding seven days, and a user-specified time range; a sketch of reading these usage metrics for such a window follows this list.
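As a hedged sketch of where those usage numbers come from, the following example reads one Dataflow usage metric from Cloud Monitoring over the last 24 hours with the google-cloud-monitoring client library. The metric type dataflow.googleapis.com/job/total_vcpu_time, the job_name label, and the project and job values are assumptions for illustration; this is not the query that the Estimated Cost page itself runs.

```python
# Sketch: read one Dataflow usage metric from Cloud Monitoring for a time window.
# PROJECT_ID, JOB_NAME, and the metric type below are illustrative assumptions.
import time

from google.cloud import monitoring_v3

PROJECT_ID = "my-project"      # assumption: replace with your project ID
JOB_NAME = "my-dataflow-job"   # assumption: replace with your job name

client = monitoring_v3.MetricServiceClient()

# Time window: the last 24 hours.
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 24 * 3600},
    }
)

# Assumed usage metric; adjust the metric type and labels for the resource
# you want to inspect (vCPU, memory, Shuffle, Streaming Engine, disk).
metric_filter = (
    'metric.type = "dataflow.googleapis.com/job/total_vcpu_time" '
    f'AND resource.labels.job_name = "{JOB_NAME}"'
)

results = client.list_time_series(
    request={
        "name": f"projects/{PROJECT_ID}",
        "filter": metric_filter,
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    for point in series.points:
        print(point.interval.end_time, point.value.double_value)
```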
Use monitoring alerts to get notified when your job costs cross a specified threshold. You can also use alerts to make changes to your jobs, such as stopping or canceling them, based on the thresholds that you set. To create a Cloud Monitoring alert rule, click Create alert; for instructions about how to configure these alerts, see Use Cloud Monitoring for Dataflow pipelines.
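Purely as an illustration of what such an alerting rule can look like when defined with the client library instead of the Create alert button, the sketch below creates a Cloud Monitoring threshold policy. The metric filter, threshold value, and aggregation are placeholder assumptions, not the policy that the console generates for job cost.

```python
# Sketch: create a Cloud Monitoring threshold alert policy with the client library.
# The metric filter and threshold below are placeholder assumptions; the console's
# Create alert flow on the Cost page generates the appropriate cost-based policy.
from google.cloud import monitoring_v3
from google.protobuf import duration_pb2

PROJECT_ID = "my-project"  # assumption: replace with your project ID

client = monitoring_v3.AlertPolicyServiceClient()

condition = monitoring_v3.AlertPolicy.Condition(
    display_name="Dataflow usage above threshold",
    condition_threshold=monitoring_v3.AlertPolicy.Condition.MetricThreshold(
        # Placeholder filter on a Dataflow usage metric; adjust the metric type,
        # labels, and threshold to match the condition you actually want.
        filter=(
            'metric.type = "dataflow.googleapis.com/job/total_vcpu_time" '
            'AND resource.type = "dataflow_job"'
        ),
        comparison=monitoring_v3.ComparisonType.COMPARISON_GT,
        threshold_value=1000.0,  # placeholder threshold
        duration=duration_pb2.Duration(seconds=300),
        aggregations=[
            monitoring_v3.Aggregation(
                alignment_period=duration_pb2.Duration(seconds=300),
                per_series_aligner=monitoring_v3.Aggregation.Aligner.ALIGN_MAX,
            )
        ],
    ),
)

policy = monitoring_v3.AlertPolicy(
    display_name="Dataflow job cost guard (sketch)",
    combiner=monitoring_v3.AlertPolicy.ConditionCombinerType.OR,
    conditions=[condition],
)

created = client.create_alert_policy(
    name=f"projects/{PROJECT_ID}", alert_policy=policy
)
print(f"Created alert policy: {created.name}")
```

To actually receive notifications when the condition fires, attach notification channels to the policy or configure them when you create the alert in the console.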
[[["Fácil de entender","easyToUnderstand","thumb-up"],["Meu problema foi resolvido","solvedMyProblem","thumb-up"],["Outro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Informações incorretas ou exemplo de código","incorrectInformationOrSampleCode","thumb-down"],["Não contém as informações/amostras de que eu preciso","missingTheInformationSamplesINeed","thumb-down"],["Problema na tradução","translationIssue","thumb-down"],["Outro","otherDown","thumb-down"]],["Última atualização 2025-09-02 UTC."],[[["\u003cp\u003eThe Estimated Cost page in the Google Cloud console provides a breakdown of the estimated costs for Dataflow jobs, based on resource usage and pricing.\u003c/p\u003e\n"],["\u003cp\u003eEstimated costs are derived from resource usage metrics in Cloud Monitoring, including vCPUs, memory, Dataflow Shuffle or Streaming Engine data, and disk usage.\u003c/p\u003e\n"],["\u003cp\u003eUsers can monitor job costs over various timeframes, such as since job start, the previous hour, the last 24 hours, the past seven days, or a custom range.\u003c/p\u003e\n"],["\u003cp\u003eCloud Monitoring alerts can be set up to notify users when job costs reach specified thresholds, allowing for actions like stopping or canceling jobs.\u003c/p\u003e\n"],["\u003cp\u003eCost monitoring is not available for Dataflow Prime jobs or GPU metrics.\u003c/p\u003e\n"]]],[],null,["# Dataflow estimated cost\n\nThe **Estimated Cost** page in the Google Cloud console shows the estimated cost of your current Dataflow\njob. Estimated costs are calculated by multiplying the resource usage metrics\nas shown in Cloud Monitoring by\nthe [price of those resources in the job region](/dataflow/pricing).\n\nTo view the estimated cost for a job, perform the following steps:\n\n1. In the Google Cloud console, go to the **Dataflow** \\\u003e\n **Jobs** page.\n\n [Go to Jobs](https://console.cloud.google.com/dataflow/jobs)\n2. Select a job.\n\n3. Click the **Cost** tab.\n\n| **Warning:** The estimated cost might not reflect the actual job cost for a variety of reasons, such as contractual discounts or temporary billing adjustments. To view the actual cost of your Dataflow jobs, view the [Cloud Billing reports for your Cloud Billing account](/billing/docs/how-to/reports#getting_started) in the Google Cloud console.\n\nUse cost monitoring\n-------------------\n\nJob cost estimates are available for both batch and streaming jobs. The\n**Estimated Cost** page in the Google Cloud console provides the following\ninformation:\n\n- Details about which resources contribute to the job cost and by how much. Resources include vCPUs, memory, Dataflow Shuffle data processed or Streaming Engine data processed, and SSD and HDD disk usage.\n- Costs over specific time windows, such as: time since the job started, the previous hour, the last 24 hours, the preceding seven days, and a user-specified time range.\n\nYou can use monitoring alerts to get notifications when your job costs cross a specified threshold.\nYou can also use alerts to make changes to your jobs, such as stopping or canceling jobs,\nbased on the thresholds that you set.\n\nTo create a Cloud Monitoring alert rule, click **Create alert** .\nFor instructions about how to configure these alerts, see\n[Use Cloud Monitoring for Dataflow pipelines](/dataflow/docs/guides/using-cloud-monitoring).\n\nLimitations\n-----------\n\nDataflow cost monitoring does not support\nDataflow Prime jobs and GPU metrics."]]