This document shows you how to use the MCP Toolbox for Databases to connect your Dataplex Universal Catalog to a variety of integrated development environments (IDEs) and developer tools that support Model Context Protocol (MCP). You can then use AI agents in your existing tools to discover data assets in Dataplex Universal Catalog.
MCP is an open protocol for connecting large language models (LLMs) to data sources such as Dataplex Universal Catalog. For more information about MCP, see Introduction to Model Context Protocol.
This guide demonstrates the connection process for the following IDEs:
- Gemini CLI
- Gemini Code Assist
- Claude Code
- Claude Desktop
- Cline (VS Code extension)
- Cursor
- Visual Studio Code (Copilot)
- Windsurf (formerly Codeium)
Before you begin
- In the Google Cloud console, go to the project selector page.
- Select or create a Google Cloud project.
- Verify that billing is enabled for your Google Cloud project.
- If you're using a local shell, then create local authentication credentials for your user account:
  gcloud auth application-default login
  You don't need to do this if you're using Cloud Shell.
  If an authentication error is returned and you are using an external identity provider (IdP), confirm that you have signed in to the gcloud CLI with your federated identity.
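If you want to double-check which account and project your local gcloud CLI session is using before continuing, the following commands are a quick sketch (PROJECT_ID is a placeholder for your own project ID):
# List credentialed accounts; the active one is marked with an asterisk.
gcloud auth list
# Print the default project currently configured for the gcloud CLI.
gcloud config get-value project
# Switch the default project if it isn't the one you want to use.
gcloud config set project PROJECT_ID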
Install the MCP Toolbox
Download the latest version of the MCP Toolbox as a binary. Select the binary that corresponds to your operating system (OS) and CPU architecture. You must use MCP Toolbox v0.10.0 or later.
Linux/amd64
curl -O https://storage.googleapis.com/genai-toolbox/VERSION/linux/amd64/toolbox
Replace VERSION with the MCP Toolbox version, for example, v0.10.0.
macOS (Darwin)/arm64
curl -O https://storage.googleapis.com/genai-toolbox/VERSION/darwin/arm64/toolbox
Replace VERSION with the MCP Toolbox version, for example, v0.10.0.
macOS (Darwin)/amd64
curl -O https://storage.googleapis.com/genai-toolbox/VERSION/darwin/amd64/toolbox
Replace VERSION with the MCP Toolbox version, for example, v0.10.0.
Windows/amd64
curl -O https://storage.googleapis.com/genai-toolbox/VERSION/windows/amd64/toolbox
Replace VERSION with the MCP Toolbox version, for example, v0.10.0.
Make the binary executable:
chmod +x toolbox
Verify the installation:
./toolbox --version
A successful installation returns the version number, for example, 0.10.0.
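You can also run the binary manually with the same arguments that the MCP clients in the next section use, to confirm that the Dataplex prebuilt tools load in your environment. This is a minimal sketch: PROJECT_ID is a placeholder, and the process simply waits for MCP messages on stdin until you stop it with Ctrl+C.
# Start the Dataplex prebuilt toolset over stdio, pointing at your project.
DATAPLEX_PROJECT=PROJECT_ID ./toolbox --prebuilt dataplex --stdio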
Configure the MCP client
Gemini CLI
- Install the Gemini CLI.
- In your working directory, create a folder named .gemini. Within that, create a settings.json file.
- Add the following configuration, replace the environment variables with your values, and save:
  {
    "mcpServers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
Gemini Code Assist
- In VS Code, install the Gemini Code Assist extension.
- Enable Agent Mode in Gemini Code Assist chat.
- In your working directory, create a folder named .gemini. Within that, create a settings.json file.
- Add the following configuration, replace the environment variables with your values, and save:
  {
    "mcpServers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
Claude Code
- Install Claude Code.
- Create a .mcp.json file in your project root, if it doesn't exist.
- Add the configuration, replace the environment variables with your values, and save:
  {
    "mcpServers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
Claude Desktop
- Open Claude Desktop and navigate to Settings.
- To open the configuration file, in the Developer tab, click Edit config.
- Add the configuration, replace the environment variables with your values,
and save:
{ "mcpServers": { "dataplex": { "command": "./PATH/TO/toolbox", "args": ["--prebuilt","dataplex","--stdio"], "env": { "DATAPLEX_PROJECT": "PROJECT_ID" } } } }
- Restart Claude Desktop.
The new chat screen displays an MCP icon with the new MCP server.
Cline
- In VS Code, open the Cline extension and then click the MCP Servers icon.
- To open the configuration file, click Configure MCP Servers.
- Add the following configuration, replace the environment variables with
your values, and save:
  {
    "mcpServers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
A green active status appears after the server connects successfully.
Cursor
- Create the .cursor directory in your project root if it doesn't exist.
- Create the .cursor/mcp.json file if it doesn't exist and open it.
- Add the following configuration, replace the environment variables with your values, and save:
  {
    "mcpServers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
- Open Cursor and navigate to Settings > Cursor Settings > MCP. A green active status appears when the server connects.
VS Code (Copilot)
- Open VS Code and create a .vscode directory in your project root if it doesn't exist.
- Create the .vscode/mcp.json file if it doesn't exist, and open it.
- Add the following configuration, replace the environment variables with your values, and save:
  {
    "servers": {
      "dataplex": {
        "command": "./PATH/TO/toolbox",
        "args": ["--prebuilt", "dataplex", "--stdio"],
        "env": {
          "DATAPLEX_PROJECT": "PROJECT_ID"
        }
      }
    }
  }
Windsurf
- Open Windsurf and navigate to Cascade assistant.
- To open the configuration file, click the MCP icon, then click Configure.
- Add the following configuration, replace the environment variables with
your values, and save:
{ "mcpServers": { "dataplex": { "command": "./PATH/TO/toolbox", "args": ["--prebuilt","dataplex","--stdio"], "env": { "DATAPLEX_PROJECT": "PROJECT_ID" } } } }
Use the tools
Your AI tool is now connected to Dataplex Universal Catalog using MCP. Try asking your AI assistant to find data assets such as BigQuery datasets, Cloud SQL instances, and other resources.
The following tool is available to the LLM:
- dataplex_search_entries: search for resources
Optional: Add system instructions
System instructions are a way to provide specific guidelines to the LLM, helping it to understand the context and respond more accurately. Set up system instructions based on the recommended system prompt.
For more information about how to configure instructions, see Use instructions to get AI edits that follow your coding style.
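As a purely illustrative sketch (the file name applies to the Gemini CLI, and the wording is an assumption rather than the recommended system prompt), you could keep project-level guidance in a GEMINI.md context file at your project root:
# Illustrative only: create a context file that the Gemini CLI reads alongside your prompts.
cat > GEMINI.md <<'EOF'
When I ask about data assets, search Dataplex Universal Catalog with the
available MCP tools before answering, and include each entry's full resource
name in your reply.
EOF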