This README provides an overview of the Software Development Kit (SDK) under development for integrating Clarifai with Databricks. Its primary use case is to facilitate interaction between Databricks and Clarifai for uploading client datasets, annotating data, and exporting annotations into Spark DataFrames or Delta tables.
The initial use case for this SDK revolves around three main objectives:
- Enable the seamless upload of datasets into a Clarifai application, simplifying data transfer from Databricks to Clarifai.
- Provide features for data annotation, making it easier for users to add labels and metadata to their datasets within the Clarifai platform.
- Offer functionality to export annotations and store them in Spark DataFrames or Delta tables, facilitating further data analysis within Databricks.
- Databricks: Runtime 13.3 or later
- Clarifai:

  ```
  pip install clarifai
  ```

  - Create your Clarifai account.
  - Follow the instructions to get your own Clarifai PAT (Personal Access Token).
- Protocol Buffers: version 4.24.2

  ```
  pip install protobuf==4.24.2
  ```
Install the package and import the `ClarifaiPySpark` class to begin.

```
pip install clarifai-pyspark
```

```python
from clarifaipyspark.client import ClarifaiPySpark
```
Create a Clarifai-PySpark client object to connect to your app on Clarifai, then select an existing dataset (or create a new one) in your Clarifai app to upload data to.
```python
claps_obj = ClarifaiPySpark(user_id=USER_ID, app_id=APP_ID, pat=CLARIFAI_PAT)
dataset_obj = claps_obj.dataset(dataset_id=DATASET_ID)
```
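As a rough illustration of the third objective (storing annotations in Spark DataFrames or Delta tables), the sketch below flattens nested annotation records into flat rows ready for `spark.createDataFrame`. The record layout, field names, and the export call mentioned in the comments are assumptions for illustration only, not the SDK's documented API.

```python
# Sketch only: flatten nested annotation records (e.g. as they might be
# returned by an export call on dataset_obj) into flat row dicts that
# spark.createDataFrame() can ingest. The schema below is an assumed
# example, not the SDK's actual output format.

def flatten_annotations(records):
    """Turn nested {input -> annotations} records into one row per annotation."""
    rows = []
    for rec in records:
        for ann in rec.get("annotations", []):
            rows.append({
                "input_id": rec.get("input_id"),
                "label": ann.get("label"),
                "confidence": ann.get("confidence"),
            })
    return rows

# Hypothetical sample record for illustration.
sample = [
    {
        "input_id": "img-001",
        "annotations": [
            {"label": "cat", "confidence": 0.98},
            {"label": "pet", "confidence": 0.91},
        ],
    }
]

rows = flatten_annotations(sample)

# In a Databricks notebook you could then materialize the rows, e.g.:
#   df = spark.createDataFrame(rows)
#   df.write.format("delta").saveAsTable("clarifai_annotations")
```

Writing through `spark.createDataFrame` followed by `df.write.format("delta")` keeps the export path inside standard Databricks APIs, so the flattened rows can feed any downstream Delta-based analysis.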
Check out these notebooks for the various operations you can perform using the clarifai-pyspark SDK.
If you want to enhance your AI journey with workflows and custom models (programmatically), the Clarifai SDK is a good place to start. Refer to the resources below:
- Docs - Clarifai Docs
- Explore our community page - Clarifai Community
- Fork and contribute to our SDK here!
- Reach out to us on socials