Create a managed notebooks instance with a custom container

This page shows you how to add a custom container to a Vertex AI Workbench managed notebooks instance as a kernel that you can run your notebook files on.

Overview

You can add a custom container for use with your managed notebooks instance. The custom container is then available as a local kernel that you can run your notebook file on.

Custom container requirements

Vertex AI Workbench managed notebooks supports any of the current Deep Learning Containers container images.

To create a custom container image of your own, you can modify one of the Deep Learning Containers container images to create a derivative container image.

To create a custom container image from scratch, make sure the container image meets the following requirements:

  • Use a Docker container image with at least one valid Jupyter kernelspec. This exposed kernelspec lets Vertex AI Workbench managed notebooks load the container image as a kernel. If your container image includes an installation of JupyterLab or Jupyter Notebook, the installation will include the kernelspec by default. If your container image doesn't have the kernelspec, you can install the kernelspec directly.

  • The Docker container image must support sleep infinity.

  • To use your custom container with the managed notebooks executor, ensure that your custom container has the nbexecutor extension.
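If your container image doesn't include JupyterLab or Jupyter Notebook, one way to satisfy the kernelspec requirement is to register a Python kernel with ipykernel. The following Dockerfile fragment is a sketch; the base image and install prefix are illustrative:

```dockerfile
# Sketch only: register a Python kernelspec so the instance can
# discover this image as a kernel. Base image and prefix are
# illustrative; adapt them to your own image.
FROM python:3.10-slim

RUN pip install --no-cache-dir ipykernel && \
    # Writes a kernelspec under /usr/local/share/jupyter/kernels/
    python -m ipykernel install --prefix=/usr/local
```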

The following example Dockerfile builds a custom container image from scratch. The image is based on Ubuntu 22.04 and includes Python 3 and JupyterLab.

FROM --platform=linux/amd64 ubuntu:22.04

# Run apt-get update and install in the same layer so a stale cached
# package index is never used with a newer install list
RUN apt-get -y update && apt-get install -y --no-install-recommends \
    python3-pip \
    pipx \
    git \
    make \
    jq \
    && rm -rf /var/lib/apt/lists/*

# Quote version specifiers that contain ">" so the shell doesn't
# interpret them as output redirection
RUN pip install \
    "argcomplete>=1.9.4" \
    poetry==1.1.14 \
    jupyterlab==3.3.0

# Create a link that points to the right python bin directory
RUN ln -s /usr/bin/pythonVERSION_NUMBER /usr/bin/python

Replace VERSION_NUMBER with the version of Python that you're using.
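Once the image builds locally, you can push it to Artifact Registry so a managed notebooks instance can use it. The commands below are a sketch; PROJECT_ID, the region, and the repository and image names are placeholders you need to replace:

```shell
# Placeholders: replace PROJECT_ID, the region, and the repo/image names.
docker build -t us-central1-docker.pkg.dev/PROJECT_ID/my-repo/my-kernel:latest .

# Authenticate Docker to Artifact Registry for that region, then push.
gcloud auth configure-docker us-central1-docker.pkg.dev
docker push us-central1-docker.pkg.dev/PROJECT_ID/my-repo/my-kernel:latest
```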

How a custom container becomes a kernel in managed notebooks

For each custom container image provided, your managed notebooks instance identifies the available Jupyter kernelspec on the container image when the instance starts. The kernelspec appears as a local kernel in the JupyterLab interface. When the kernelspec is selected, the managed notebooks kernel manager runs the custom container as a kernel and starts a Jupyter session on that kernel.
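You can preview which kernelspecs the instance will discover by listing them inside the container locally. This assumes Docker is installed on your machine; IMAGE is a placeholder for your container image path:

```shell
# IMAGE is a placeholder for your custom container image path.
docker run --rm IMAGE jupyter kernelspec list
```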

How custom container kernels are updated

Vertex AI Workbench pulls the latest container image for your kernel:

  • When you create your instance.

  • When you upgrade your instance.

  • When you start your instance.

The custom container kernel doesn't persist when your instance is stopped, so each time your instance is started, Vertex AI Workbench pulls the latest version of the container image.

If your instance is running when a new version of a container is released, your instance's kernel isn't updated until you stop and start your instance.

Custom container image availability

Deep Learning Containers container images are available to all users. When you use a Deep Learning Containers container image, you must grant specific roles to your instance's service account so your instance can load the Deep Learning Containers container image as a kernel. Learn more about the required permissions and how to grant them in the Permissions section.

If you want to use your own custom container image, it must be located in Artifact Registry and the container image must be publicly available.
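To host a publicly available image, you can create a Docker-format Artifact Registry repository and grant public read access. The repository name and region below are illustrative:

```shell
# Illustrative repository name and region; adjust to your project.
gcloud artifacts repositories create my-repo \
    --repository-format=docker \
    --location=us-central1

# Make images in the repository publicly readable so instances can pull them.
gcloud artifacts repositories add-iam-policy-binding my-repo \
    --location=us-central1 \
    --member=allUsers \
    --role=roles/artifactregistry.reader
```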

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Enable the Notebooks and Artifact Registry APIs.

    Enable the APIs


Add a custom container while creating an instance

To add a custom container to a managed notebooks instance, the custom container image must be specified at instance creation.

To add a custom container while you create a managed notebooks instance, complete the following steps.

  1. In the Google Cloud console, go to the Managed notebooks page.

    Go to Managed notebooks

  2. Click  Create new.

  3. In the Name field, enter a name for your instance.

  4. Click the Region list, and select a region for your instance.

  5. In the Environment section, select Provide custom Docker images.

  6. Add a Docker container image in one of the following ways:

    • Enter a Docker container image path. For example, to use a TensorFlow 2.12 CPU container image from Deep Learning Containers, enter us-docker.pkg.dev/deeplearning-platform-release/gcr.io/tf-cpu.2-12.py310.
    • Click Select to add a Docker container image from Artifact Registry. Then, on the Artifact Registry tab, change the project to the project that contains your container image, and select your container image.
  7. Complete the rest of the Create instance dialog according to your needs.

  8. Click Create.

  9. Vertex AI Workbench automatically starts the instance. When the instance is ready to use, Vertex AI Workbench activates an Open JupyterLab link.

Grant permissions to Deep Learning Containers container images

If you aren't using a Deep Learning Containers container image, skip this section.

To ensure that your instance's service account has the necessary permissions to load a Deep Learning Containers container image from Artifact Registry, ask your administrator to grant your instance's service account the required IAM roles on your instance.

For more information about granting roles, see Manage access to projects, folders, and organizations.

Your administrator might also be able to give your instance's service account the required permissions through custom roles or other predefined roles.
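As a sketch, an administrator can grant a role to the service account at the project level with the following command. PROJECT_ID, SERVICE_ACCOUNT_EMAIL, and ROLE_ID are placeholders; use the role your administrator identifies for pulling the image:

```shell
# Placeholders: replace PROJECT_ID, SERVICE_ACCOUNT_EMAIL, and ROLE_ID.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role=ROLE_ID
```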

Set up a notebook file to run in your custom container

To open JupyterLab, create a new notebook file, and set it up to run on your custom container's kernel, complete the following steps.

  1. Next to your managed notebooks instance's name, click Open JupyterLab.

  2. In the Authenticate your managed notebook dialog, click the button to get an authentication code.

  3. Choose an account and click Allow. Copy the authentication code.

  4. In the Authenticate your managed notebook dialog, paste the authentication code, and then click Authenticate.

    Your managed notebooks instance opens JupyterLab.

  5. Select File > New > Notebook.

  6. In the Select kernel dialog, select the kernel for the custom container image that you want to use, and then click Select. Larger container images may take some time to appear as a kernel. If the kernel that you want isn't there yet, try again in a few minutes. You can change the kernel whenever you want to run your notebook file on a different kernel.

    Your new notebook file opens.
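To confirm that the notebook is running on your custom container's kernel rather than a default one, you can run a quick check in the first cell. This prints the interpreter path and Python version of whatever kernel the notebook is attached to:

```python
# Run in the first notebook cell: shows which Python interpreter the
# selected kernel uses, so you can confirm it's your custom container's.
import sys

print(sys.executable)
print(".".join(str(part) for part in sys.version_info[:3]))
```

If the interpreter path matches the Python installation inside your container image, the kernel selection worked.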

What's next