Managed connectivity overview
This document provides an overview of the managed connectivity pipelines that you can use to import metadata from third-party sources into Dataplex Universal Catalog.
Managed connectivity lets you import metadata into Dataplex Universal Catalog at scale. A managed connectivity pipeline extracts metadata from your data sources and then imports the metadata into Dataplex Universal Catalog. If necessary, the pipeline also creates Dataplex Universal Catalog entry groups in your Google Cloud project. You can orchestrate the workflows and schedule the import jobs based on your requirements.
You build your own custom connectors to extract metadata from third-party sources. For example, you can build a connector to extract metadata from sources like MySQL, SQL Server, Oracle, Snowflake, Databricks, and others. For steps to build a sample custom connector, see Develop a custom connector for metadata import (/dataplex/docs/develop-custom-connector).
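To illustrate what a custom connector does, here is a minimal, hypothetical sketch. It is not the official connector template: sqlite3 stands in for a real source database, and the record shape is a simplified assumption rather than the exact Dataplex Universal Catalog import-file schema.

```python
import json
import sqlite3


def extract_table_metadata(conn):
    """Query the source database's catalog for table and column names.

    sqlite3 stands in for a real source such as MySQL or Oracle; a real
    connector would query something like information_schema instead.
    """
    tables = []
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table_name,) in rows:
        # PRAGMA table_info rows are (cid, name, type, notnull, default, pk).
        columns = [col[1] for col in conn.execute(f"PRAGMA table_info({table_name})")]
        tables.append({"table": table_name, "columns": columns})
    return tables


def to_import_lines(tables, entry_group):
    """Serialize extracted metadata as JSON Lines, one record per table.

    The field names here are illustrative, not the authoritative
    metadata import file schema.
    """
    lines = []
    for t in tables:
        lines.append(json.dumps({
            "entry_group": entry_group,
            "entry_id": t["table"],
            "schema": t["columns"],
        }))
    return lines


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
for line in to_import_lines(extract_table_metadata(conn), "demo-entry-group"):
    print(line)
```

A real connector would package logic like this as a container image and write the resulting file to Cloud Storage for the import job to pick up.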
You can also use the community-contributed custom connectors that are available for a variety of third-party sources.
At a high level, you build a connector for your data source as an Artifact Registry image that can run on Dataproc Serverless, and then you run the managed connectivity pipeline in Workflows, an orchestration platform. The managed connectivity pipeline does the following:
Creates the target entry group based on your configuration, if the entry group doesn't exist yet.
Runs the connector. The connector extracts metadata from your data source and generates a metadata import file that can be imported into Dataplex Universal Catalog.
Monitors the progress of the metadata extraction.
Runs a metadata import job to import the metadata into Dataplex Universal Catalog.
Monitors the progress of the metadata import job.
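The steps above can be sketched as the following control flow. Every function body here is a hypothetical stand-in (the real pipeline is a Workflows definition that drives Dataproc Serverless and the metadata import API); the sketch only illustrates the create-run-monitor-import-monitor sequence.

```python
def entry_group_exists(entry_group_id):
    # Hypothetical stand-in for a Dataplex Universal Catalog lookup.
    return False


def run_pipeline(entry_group_id):
    """Illustrative sequence of the five pipeline stages."""
    log = []
    # 1. Create the target entry group if it doesn't exist yet.
    if not entry_group_exists(entry_group_id):
        log.append("create_entry_group")
    # 2. Run the connector on Dataproc Serverless; it writes an import file.
    log.append("run_connector")
    # 3. Monitor the extraction until the connector batch finishes.
    log.append("extraction_done")
    # 4. Run a metadata import job against the generated import file.
    log.append("run_import_job")
    # 5. Monitor the import job until it completes.
    log.append("import_done")
    return log


print(run_pipeline("demo-entry-group"))
```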
The managed connectivity pipeline uses Dataproc Serverless to run the connector, and Dataplex Universal Catalog metadata import API methods to run the metadata import job.
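As a hedged sketch of the import step: a metadata import job can be created through a REST call to the metadata import API. The project, location, bucket, and job-ID values below are placeholders, and the request-body field names reflect the author's understanding of the API; consult the API reference for the authoritative schema.

```python
import json

DATAPLEX_API = "https://dataplex.googleapis.com/v1"


def build_import_job_request(project, location, source_bucket):
    """Assemble the URL and body for a metadata import job (illustrative)."""
    url = f"{DATAPLEX_API}/projects/{project}/locations/{location}/metadataJobs"
    body = {
        "type": "IMPORT",
        "import_spec": {
            # The import file generated by the connector lives here.
            "source_storage_uri": f"gs://{source_bucket}/",
            "entry_sync_mode": "FULL",
            "aspect_sync_mode": "INCREMENTAL",
        },
    }
    return url, body


url, body = build_import_job_request("my-project", "us-central1", "my-import-bucket")
print(url)
print(json.dumps(body, indent=2))
# To actually create the job, POST this body with an OAuth token, for example:
#   requests.post(url, params={"metadata_job_id": "job-1"}, json=body,
#                 headers={"Authorization": f"Bearer {token}"})
```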
The metadata that you import consists of Dataplex Universal Catalog entries and their aspects. For more information about Dataplex Universal Catalog metadata, see About metadata management in Dataplex Universal Catalog.
Community-contributed custom connectors
To import metadata from third-party sources, you can use custom connectors that are contributed by the community. See each connector's README file for setup instructions and more information about the connector. Note that these connectors are not officially supported by Google.
What's next

Import metadata from a custom source using Workflows (/dataplex/docs/import-using-workflows-custom-source)
Develop a custom connector for metadata import (/dataplex/docs/develop-custom-connector)
Import metadata using a custom pipeline (/dataplex/docs/import-metadata)

Last updated 2025-08-19 UTC.