Prerequisites
This page describes the prerequisites for deploying the Data Foundation for Cortex Framework. Cortex Data Foundation is built on BigQuery and stores your data in an organized way for analysis and AI development.
Take some time to think about your business goals, the data you want to analyze, and how Cortex can help you achieve your objectives. Familiarize yourself with the following resources before diving into the deployment process.
Understand your business needs
Understand your business requirements and the technical components of the Cortex Framework before deployment. Consider your company's business rules, its requirements, and the actual data sources it uses.
Review available workloads and data sources
Review the workloads and data sources that Cortex Framework supports. Become familiar with the type of data you want to work with (for example, data from Salesforce, SAP, or another data source) in Data sources and workloads.
Brush up on basic knowledge
Get familiar with some key Google Cloud tools:
- Google Cloud console
- Cloud Shell
- Cloud Shell Editor
- BigQuery
- Cloud Build
- Identity and Access Management
- Cloud Composer
- Apache Airflow
- Dataflow
Learn about Change Data Capture (CDC), which tracks updates to your data so that your analysis is always based on the latest information. For more information, see the CDC guide.
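As a rough illustration of the CDC concept (not the Cortex-specific implementation, and using an invented record shape), CDC processing can be thought of as replaying a stream of change records against a keyed "current state" table so only the latest version of each record survives:

```python
# Minimal sketch of Change Data Capture (CDC) processing: apply a
# stream of change records (insert/update/delete) to a keyed
# current-state table. The record fields here ("id", "op", "data")
# are illustrative assumptions, not the Cortex Framework schema.

def apply_changes(state: dict, changes: list) -> dict:
    """Apply CDC change records to the current state, in order."""
    for change in changes:
        key = change["id"]
        op = change["op"]
        if op in ("INSERT", "UPDATE"):
            state[key] = change["data"]   # upsert the latest version
        elif op == "DELETE":
            state.pop(key, None)          # remove the record if present
    return state

# Example: replay a small change stream against an empty table.
state = apply_changes({}, [
    {"id": 1, "op": "INSERT", "data": {"name": "Alice"}},
    {"id": 2, "op": "INSERT", "data": {"name": "Bob"}},
    {"id": 1, "op": "UPDATE", "data": {"name": "Alicia"}},
    {"id": 2, "op": "DELETE"},
])
print(state)  # {1: {'name': 'Alicia'}}
```

In Cortex Data Foundation this merge logic runs in BigQuery rather than in application code, but the idea is the same: raw change records land in staging tables, and CDC processing keeps the reporting tables current.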
Explore the repository
The prerequisites outlined on this page are specifically designed for deploying Cortex Data Foundation from the official GitHub repository. The repository contains the essential resources for deploying Cortex Framework, including configuration files, Entity Relationship Diagrams (ERDs), and predefined data models.
Next steps
After you complete these prerequisites, you are ready to move on to the deployment steps:
- Establish workloads.
- Clone repository.
- Determine integration mechanism.
- Set up components.
- Configure deployment.
- Execute deployment.