# Send records to error

This page explains how to remove common errors from a dataset when you prepare
data in the Wrangler workspace of the Cloud Data Fusion Studio.

The following types of errors occur in datasets:

- Systemic errors, such as service or instance failures
- Logical errors, such as pipeline run failures
- Data errors, such as invalid credit card numbers, invalid date formats, or invalid zip codes

Wrangler provides a set of over 50 directives to help you remove common errors
from a dataset.

To send records to error, follow these steps:

1. [Go to the Wrangler workspace in Cloud Data Fusion](/data-fusion/docs/concepts/wrangler-overview#navigate-to-wrangler).
2. On the **Data** tab, go to a column name and click the expander arrow.
3. Select **Send to error**, and then select the condition that sends bad records to error.
Wrangler removes values that match the specified condition from the sample and
adds the `send to error` directive to the recipe. When you run the data
pipeline, the transformation is applied to all values in the column.
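For example, a recipe that flags records with an empty `cost` column might include a directive like the following. The column name and condition here are illustrative, not from this page:

```
send-to-error empty(cost)
```

When the pipeline runs, any record whose `cost` value matches the condition is flagged and routed to the error flow instead of continuing downstream.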
Add an error collector plugin to a data pipeline
------------------------------------------------
When you add a Wrangler transformation with a recipe that includes the `send to
error` directive to a data pipeline, you can choose to connect it to the Error
Collector plugin. The Error Collector plugin is usually connected to a
downstream sink plugin, such as a BigQuery sink.
When you run the pipeline, the records flagged by the `send to error` directive
go from the Wrangler transformation step in your pipeline to the Error Collector
step, and then to the sink step. When the run finishes, you can examine the
flagged records written to the sink.
If your recipe includes the `send to error` transformation, but the pipeline
doesn't include the Error Collector plugin, the records flagged by the `send to
error` directive are dropped during the pipeline run.
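The routing behavior described above can be sketched as a small conceptual model. This is not Data Fusion code; the record fields and the condition are hypothetical, and it only illustrates how flagged records either reach the error sink or are dropped:

```python
# Conceptual model of "send to error" routing in a pipeline run.
# NOT Data Fusion code: the fields and condition below are illustrative.

def run_pipeline(records, condition, has_error_collector):
    """Split records: valid ones continue downstream; flagged ones are
    written to the error sink only if an Error Collector is connected."""
    downstream, error_sink = [], []
    for record in records:
        if condition(record):  # the "send to error" condition matched
            if has_error_collector:
                error_sink.append(record)  # collected, then written to a sink
            # without an Error Collector, the flagged record is dropped
        else:
            downstream.append(record)
    return downstream, error_sink

records = [{"cost": "12.5"}, {"cost": ""}, {"cost": "abc"}]
is_bad = lambda r: not r["cost"].replace(".", "", 1).isdigit()

good, errors = run_pipeline(records, is_bad, has_error_collector=True)
# good keeps the valid record; errors holds the two flagged records
```

Running the same model with `has_error_collector=False` returns an empty error list, mirroring how flagged records are silently dropped when no Error Collector is present.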
What's next
-----------

- Learn more about [Wrangler directives](/data-fusion/docs/concepts/wrangler-overview#apply_directives).

Last updated 2025-08-29 UTC.