Connect Confluence Data Center On-premises

This page describes how to create a Confluence Data Center data store and search app in Agentspace Enterprise, and how to sync your on-premises Confluence data with Agentspace Enterprise.

After you set up your data source and import data the first time, you can choose how often the data store syncs with that source.

Before you begin

Before setting up your connection, do the following:

  1. Verify that you have the Confluence Administrator permission, which is required to fetch Access Control List (ACL) information.

  2. Install the Permission Accessor for Confluence Data Center plugin. This plugin introduces REST endpoints that let Google Agentspace fetch space permissions, content restrictions, and licensed users' email addresses, so that the correct permissions are applied in the Google Agentspace search experience.

  3. Make sure that you have the following details:

    • Service attachment (required for private destination type only): Configure a service attachment for secure data transfer.
    • Username and password: Obtain valid credentials for authentication from your Confluence administrator.
    • Domain URL: For a private destination type, specify the URL of the Confluence Data Center instance.
    • Optional: Base domain name: Provide the base domain name for the Confluence instance.
    • Optional: Destination port: Identify the port used for communication with the Confluence Data Center.
  4. Use the following configuration guidelines to establish connections with Private Service Connect (PSC). Adjust or add resources as needed. Make sure the PSC service attachment is properly configured to connect to the private instance and meets the requirements for a published service.

    1. Configure network settings:

      1. Place the PSC service attachment and load balancer in different subnets within the same Virtual Private Cloud network.

      2. The backend system must remain closed to the public network for security reasons. However, verify that it can accept traffic from the following sources:

        • For proxy-based HTTP(S) load balancers (L4 proxy ILB, L7 ILB), configure the backend to accept requests from the proxy-only subnet in the Virtual Private Cloud network.

        • For more information, see the Proxy-only subnets for Envoy-based load balancers documentation.

    2. Adjust firewall rules:

      1. Ingress rules:

        • Allow traffic from the PSC service attachment subnet to the internal load balancer (ILB) subnet.
        • Make sure that the ILB can send traffic to the backend.
        • Permit health check probes to reach the backend.
      2. Egress rules: Enable egress traffic by default, unless specific deny rules apply.

  5. Additional considerations: Make sure to keep all the components, including the PSC service attachment and load balancer, in the same region.
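The subnet and firewall guidelines above can be sketched with gcloud. This is a minimal sketch, not a definitive configuration: the network name, region, CIDR ranges, and port are placeholder assumptions that you must adapt to your environment. (The health check source ranges 130.211.0.0/22 and 35.191.0.0/16 are Google Cloud's documented probe ranges.)

```shell
# Proxy-only subnet required by Envoy-based (L7) internal load balancers.
# Network name, region, and range are placeholders.
gcloud compute networks subnets create proxy-only-subnet \
    --network=confluence-vpc --region=us-central1 \
    --range=10.0.2.0/24 --purpose=REGIONAL_MANAGED_PROXY --role=ACTIVE

# Ingress: allow traffic from the PSC NAT subnet to the ILB subnet.
gcloud compute firewall-rules create allow-psc-to-ilb \
    --network=confluence-vpc --direction=INGRESS --action=ALLOW \
    --rules=tcp:443 --source-ranges=10.0.1.0/24

# Ingress: allow Google Cloud health check probes to reach the backend.
gcloud compute firewall-rules create allow-health-checks \
    --network=confluence-vpc --direction=INGRESS --action=ALLOW \
    --rules=tcp:443 --source-ranges=130.211.0.0/22,35.191.0.0/16
```

Keeping the PSC NAT subnet, proxy-only subnet, and backend subnet distinct makes the source ranges in these rules unambiguous.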

Generate a service attachment

Use the following steps to generate a service attachment:

  1. Decide endpoint type: Select Public or Private endpoint.

  2. For Public endpoint: If the Confluence Data Center destination type is Public, you don't need to create a service attachment. Instead, use your public URL in the Domain URL field of the Google Cloud console when creating your connector.

  3. For Private endpoint:

    1. Use Private Service Connect (PSC) to enable connections from private instances to Google Cloud.
    2. Create a Virtual Private Cloud network and required subnets.
    3. Create a Virtual Machine (VM) instance and install the backend service.
    4. Optional: Set up a health check probe to monitor backend health.
    5. Add a load balancer to route traffic to the VM or backend.
    6. Define firewall rules to allow traffic between the PSC endpoint and the backend.
    7. Publish the endpoint by creating a PSC service attachment.
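The private endpoint steps above can be sketched with gcloud. All resource names, the region, and the IP ranges below are placeholders, and the sketch assumes the internal load balancer and its forwarding rule (here called `confluence-ilb-rule`) have already been created:

```shell
# 1. VPC network and a subnet for the backend (placeholder names/ranges).
gcloud compute networks create confluence-vpc --subnet-mode=custom
gcloud compute networks subnets create confluence-subnet \
    --network=confluence-vpc --region=us-central1 --range=10.0.0.0/24

# 2. NAT subnet reserved for Private Service Connect.
gcloud compute networks subnets create psc-nat-subnet \
    --network=confluence-vpc --region=us-central1 \
    --range=10.0.1.0/24 --purpose=PRIVATE_SERVICE_CONNECT

# 3. Publish the ILB's forwarding rule as a PSC service attachment.
#    ACCEPT_MANUAL means each consumer project must be allowlisted later.
gcloud compute service-attachments create confluence-attachment \
    --region=us-central1 \
    --producer-forwarding-rule=confluence-ilb-rule \
    --connection-preference=ACCEPT_MANUAL \
    --nat-subnets=psc-nat-subnet
```

The resulting service attachment URI is what you enter in the Service attachment field when creating the connector.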

Create a Confluence Data Center user and set up permissions

To enable Agentspace Enterprise to obtain data from Confluence, create a new user and assign administrator permissions to that user. Only Confluence administrators can view and manage permissions across all spaces.

  1. Sign in as an administrator:

    1. Go to your Atlassian domain site and open the Confluence Data Center instance.
    2. Enter the administrator username and password.
    3. Click Log In.
  2. Create a new user:

    When creating a data store, you must provide a user account that is used to obtain data from the third-party instance.

    1. Click the settings icon.
    2. Select User management.
    3. Enter the administrator credentials, if prompted.
    4. In the Administration page, click Create user.
    5. Enter the email address, full name, username, and password.
    6. Click Create user.
  3. Configure user permissions:

    1. In the Confluence administration page, navigate to the Users and security tab and click Groups.
    2. Search for the confluence-administrators group and add the newly created user to this group.
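To confirm outside the UI that the newly created user can authenticate, one option is to call the Confluence REST API with basic authentication. The URL, username, and password below are placeholders for your own values:

```shell
# Returns the profile of the authenticated user as JSON if the
# credentials are valid; an HTTP 401 indicates bad credentials.
curl -u agentspace-sync:your-password \
    -H "Accept: application/json" \
    https://confluence.example.com/rest/api/user/current
```

If your instance enforces CAPTCHA after failed logins, a browser login may be required before API authentication succeeds again.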

Create a Confluence Data Center On-premises connector

Console

  1. In the Google Cloud console, go to the Agentspace Enterprise page.

    Agentspace Enterprise

  2. In the navigation menu, click Data stores.

  3. Click Create data store.

  4. On the Select a data source page, scroll or search for Confluence data center to connect your third-party source.

  5. Enter your authentication information and click Continue.

  6. From the Destination type drop-down list, select Public or Private.

    1. For Public destination type, you don't need to create a service attachment. Instead, use your public URL in the Domain URL field of the Google Cloud console.

    2. For Private destination type, enter all the required information:

      1. If the region of the service attachment is different from the region of your data connector, select Enable PSC Global Access.
      2. For an instance with a domain URL:
        • Service attachment: Enter your service attachment.
        • Optional: Base domain name: Enter your base domain.
        • Domain URL: Enter your domain URL.
        • Optional: Destination port: Enter your destination port.
      3. For an instance without a domain URL:
        • Service attachment: Enter your service attachment.
        • Optional: Destination port: Enter your destination port.
  7. Click Continue.

  8. Optional: Advanced options: Select and enable Proxy settings and SSL settings, if required.

  9. Under Entities to sync, select all the required entities to sync and click Continue.

  10. Select the synchronization frequency for full sync, and optionally, for incremental sync. For more information, see Sync frequency.

    Sync frequency settings for full and incremental data sync.

    If you want to schedule separate full syncs of entity and identity data, expand the menu under Full sync and then select Custom options.

    Setting separate schedules for full entity sync and full identity sync.
  11. Select a region for your data connector and enter a name for your data connector.

For Private destination type, after you submit the connector details, Agentspace Enterprise sends a connection request to your PSC service attachment. Navigate to your connector to see a message asking you to allowlist a project ID in PSC. The connector remains in the Error state until you accept the connection in PSC. After you accept the connection request, the connector moves to the Active state during the next sync run. If your PSC service attachment is configured to accept all connections, the connector moves to the Active state automatically after creation.
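If your service attachment was created with ACCEPT_MANUAL, one way to allowlist the project ID shown on the connector page is with gcloud. The attachment name, region, project ID, and connection limit below are placeholders:

```shell
# Accept connections from the consumer project, up to 10 connections.
gcloud compute service-attachments update confluence-attachment \
    --region=us-central1 \
    --consumer-accept-list=consumer-project-id=10

# Alternatively, accept all incoming connections (no allowlisting needed).
gcloud compute service-attachments update confluence-attachment \
    --region=us-central1 --connection-preference=ACCEPT_AUTOMATIC
```

ACCEPT_MANUAL with an explicit accept list is the more restrictive choice; ACCEPT_AUTOMATIC trades that control for a connector that activates without manual approval.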

For Public destination type, the connector automatically enters the Active state after submission.

To verify the state of the data store and the ingestion activity, do the following:

  1. Navigate to the connector in the data store list and monitor its state until it changes to Active.

  2. After the connector state changes to Active, click the required entity and confirm that all selected entities are ingested. The data store state transitions from Creating to Running when synchronization begins and changes to Active once ingestion completes, indicating that the data store is set up. Depending on the size of your data, ingestion can take several hours.
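To check the data stores from the command line rather than the console, you can query the Discovery Engine API, which Agentspace Enterprise data stores are exposed through. The project ID is a placeholder, and the `global` location and `default_collection` path segments assume the defaults for your setup:

```shell
# Lists data stores in the project, including their names and states.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://discoveryengine.googleapis.com/v1/projects/your-project-id/locations/global/collections/default_collection/dataStores"
```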

Next steps