Data and AI Assets Catalog and Execution Engine
Allows upload, registration, execution, and deployment of:
- AI pipelines and pipeline components
- Models
- Datasets
- Notebooks
For more details about the project, see the announcement blog post.
Additionally it provides:
- Automated sample pipeline code generation to execute registered models, datasets and notebooks
- Pipelines engine powered by Kubeflow Pipelines on Tekton, the core of Watson AI Pipelines
- Components registry for Kubeflow Pipelines
- Dataset management by Datashim
- Preregistered Datasets from Data Asset Exchange (DAX) and Models from Model Asset Exchange (MAX)
- Serving engine by KFServing
- Model Metadata schemas
To get a minimal MLX instance up and running with the asset catalog only, we created a Quickstart Guide using Docker Compose.
For a full deployment, we use the Kubeflow kfctl tooling.
- By default the MLX UI is available at `<public-ip-of-a-cluster-node>:30380/os`.

  To find the public IP of a node in your cluster, run:

  ```shell
  kubectl get node -o wide
  ```

  and look for the `EXTERNAL-IP` column.
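If you want to extract the external IP programmatically rather than reading the table by eye, the column can be picked out with a small `awk` one-liner. The sketch below runs against inlined sample output (the node names and IPs are made up for illustration); on a real cluster you would pipe `kubectl get node -o wide` into the same `awk` command instead.

```shell
# Sample `kubectl get node -o wide` output, inlined for illustration only.
cat <<'EOF' > nodes.txt
NAME     STATUS   ROLES    AGE   VERSION   INTERNAL-IP   EXTERNAL-IP
node-1   Ready    master   42d   v1.21.1   10.0.0.4      203.0.113.10
node-2   Ready    worker   42d   v1.21.1   10.0.0.5      203.0.113.11
EOF

# Locate the EXTERNAL-IP column by its header name, then print that
# column for every node row.
awk 'NR==1 {for (i=1; i<=NF; i++) if ($i=="EXTERNAL-IP") col=i; next} {print $col}' nodes.txt
```

Matching the column by header name (rather than hard-coding a column index) keeps the command working even if `kubectl` adds or reorders columns.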
- If you are on an OpenShift cluster, you can also make use of the IstioIngressGateway Route. You can find it in the OpenShift Console or via the CLI:

  ```shell
  oc get route -n istio-system
  ```
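Once you know the route's hostname, the MLX UI URL is that host plus the `/os` path. A minimal sketch, assuming the route is named `istio-ingressgateway` (check `oc get route -n istio-system` on your cluster for the actual name), and using a made-up host for illustration:

```shell
# On a real cluster, the route host could be read with jsonpath, e.g.:
#   HOST=$(oc get route istio-ingressgateway -n istio-system -o jsonpath='{.spec.host}')
# For illustration, assume the route resolved to this (hypothetical) host:
HOST=istio-ingressgateway-istio-system.apps.example.com

# The MLX UI is served under /os, so the full URL becomes:
echo "http://${HOST}/os"
```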
Import data and AI assets using MLX's catalog importer
MLX Troubleshooting Instructions
- Slack: @lfaifoundation/ml-exchange
- Mailing lists:
- MLX-Announce for top-level milestone messages and announcements
- MLX-TSC for top-level governance discussions and decisions
- MLX-Technical-Discuss for technical discussions and questions