If you find this project helpful, please consider giving ZenML a star on GitHub. Your support helps promote the project and lets others know it's worth checking out.
Thank you for your support! 🌟
This Terraform module sets up the necessary GCP infrastructure for a ZenML stack. It provisions various GCP services and resources, and registers a ZenML stack using these resources with your ZenML server, allowing you to create an internal MLOps platform for your entire machine learning team.
- Terraform installed (version >= 1.9)
- GCP account set up
- To authenticate with GCP, you need to have the `gcloud` CLI installed on your machine and to have run `gcloud auth application-default login` to set up your credentials.
- You'll need a ZenML server (version >= 0.62.0) deployed in a remote setting where it can be accessed from GCP. You have the option to either self-host a ZenML server or register for a free ZenML Pro account. Once you have a ZenML server set up, you also need to create a ZenML Service Account API key for it. You can do this by running the following command in a terminal where you have the ZenML CLI installed:
zenml service-account create <service-account-name>
- This Terraform module uses the ZenML Terraform provider. It is recommended to use environment variables to configure the ZenML Terraform provider with the API key and server URL. You can set the environment variables as follows:
export ZENML_SERVER_URL="https://your-zenml-server.com"
export ZENML_API_KEY="your-api-key"
The Terraform module in this repository creates the following resources in your GCP project:
- a GCS bucket
- a Google Artifact Registry
- a Cloud Composer environment (only if the `orchestrator` variable is set to `airflow`)
- a Service Account with the minimum necessary permissions to access the GCS bucket, the Google Artifact Registry and the GCP project in order to build and push container images with Google Cloud Build, store artifacts and run pipelines with Vertex AI, SkyPilot or GCP Cloud Composer
- depending on the target ZenML Server capabilities, different authentication methods are used:
- for a self-hosted ZenML server, a Service Account Key is generated and shared with the ZenML server
- for a ZenML Pro account, GCP Workload Identity Federation is used to authenticate with the ZenML server, so that no sensitive credentials are shared with the ZenML server. For this, a GCP Workload Identity Pool and a GCP Workload Identity Provider are created and linked to the GCP Service Account. There's only one exception: when the SkyPilot orchestrator is used, this authentication method is not supported, so the Service Account Key is used instead.
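After the module is applied, you can check which of these authentication methods ended up configured on the provisioned GCP Service Connector. A minimal sketch using the ZenML CLI, where `<connector-name-or-id>` is a placeholder for the connector name generated by the module:

```shell
# list all service connectors registered on the ZenML server
zenml service-connector list

# show the details (including the authentication method) of the connector
# created for this stack; take the name or ID from the list output above
zenml service-connector describe <connector-name-or-id>
```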
The Terraform module automatically registers a fully functional GCP ZenML stack directly with your ZenML server. The ZenML stack is based on the provisioned GCP resources and is ready to be used to run machine learning pipelines.
The ZenML stack configuration is the following:
- a GCP Artifact Store linked to the GCS bucket via a GCP Service Connector configured with the Service Account credentials
- a GCP Container Registry linked to the Google Artifact Registry via a GCP Service Connector configured with the Service Account credentials
- depending on the `orchestrator` input variable:
  - if `orchestrator` is set to `local`: a local Orchestrator. This can be used in combination with the Vertex AI Step Operator to selectively run some steps locally and some on Vertex AI.
  - if `orchestrator` is set to `vertex` (default): a Vertex AI Orchestrator linked to the GCP project via a GCP Service Connector configured with the Service Account credentials
  - if `orchestrator` is set to `skypilot`: a SkyPilot Orchestrator linked to the GCP project via a GCP Service Connector configured with the Service Account credentials
  - if `orchestrator` is set to `airflow`: an Airflow Orchestrator linked to the Cloud Composer environment
- a Google Cloud Build Image Builder linked to the GCP project via a GCP Service Connector configured with the Service Account credentials
- a Vertex AI Step Operator linked to the GCP project via a GCP Service Connector configured with the Service Account credentials
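Once the stack has been registered, a quick way to review these components from any machine where the ZenML CLI is connected to the same server might look like this (the stack name here is the example value used in the Terraform configuration below):

```shell
# show the components of the stack registered by the Terraform module
zenml stack describe my-zenml-stack
```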
To use the ZenML stack, you will need to install the required integrations:
- for Vertex AI:
zenml integration install gcp
- for SkyPilot:
zenml integration install gcp skypilot_gcp
- for Airflow:
zenml integration install gcp airflow
To use this module, aside from the prerequisites mentioned above, make sure you have created a ZenML Service Account API key for your ZenML Server (see the command in the prerequisites) and that the `ZENML_SERVER_URL` and `ZENML_API_KEY` environment variables are set accordingly. Then include a configuration similar to the following in your Terraform code:
terraform {
required_providers {
google = {
source = "hashicorp/google"
}
zenml = {
source = "zenml-io/zenml"
}
}
}
provider "google" {
region = "europe-west3"
project = "my-project"
}
provider "zenml" {
# server_url = <taken from the ZENML_SERVER_URL environment variable if not set here>
# api_key = <taken from the ZENML_API_KEY environment variable if not set here>
}
module "zenml_stack" {
source = "zenml-io/zenml-stack/gcp"
orchestrator = "vertex" # or "skypilot", "airflow" or "local"
zenml_stack_name = "my-zenml-stack"
}
output "zenml_stack_id" {
value = module.zenml_stack.zenml_stack.id
}
output "zenml_stack_name" {
value = module.zenml_stack.zenml_stack.name
}
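With the configuration above in place, a typical workflow to provision the stack and start using it might look like the following sketch (the stack name comes from the `zenml_stack_name` output; `zenml stack set` assumes the ZenML CLI on your machine is connected to the same ZenML server):

```shell
# provision the GCP resources and register the stack with the ZenML server
terraform init
terraform apply

# read back the name of the registered stack
terraform output zenml_stack_name

# activate the stack for subsequent pipeline runs
zenml stack set my-zenml-stack
```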
ZenML Documentation · ZenML Starter Guide · ZenML Examples · ZenML Blog
If you need assistance, join our Slack community or open an issue on our GitHub repo.