This project fine-tunes a BLOOM-based language model to generate marketing emails from product descriptions. It uses parameter-efficient adapters to keep fine-tuning and deployment lightweight; the sketches after the feature list below illustrate each step.
- Model Fine-tuning: Fine-tuned bigscience/bloom-1b7 as the base causal language model (see the loading sketch below).
- Dataset Preparation: Loaded synthetic training data from AryanPrakhar/marketing_mail_data and tokenized it for causal language modeling (see the data sketch below).
- Training: Configured the batch size, learning rate, and gradient-accumulation steps (see the training sketch below).
- Inference: Generated marketing emails from product-description prompts (see the generation sketch below).
- Model Deployment: Published the fine-tuned model and adapters to the Hugging Face Hub (see the upload sketch below).
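
Since this README does not pin down exact hyperparameters, every concrete value in the sketches below (LoRA rank, batch size, learning rate, prompt template, repo id) is an illustrative assumption, not the project's recorded configuration. First, loading the base model; because adapters are published to the Hub, a PEFT/LoRA setup is assumed here:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE_MODEL = "bigscience/bloom-1b7"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Wrap the base model with a LoRA adapter so only a small fraction of the
# parameters is trained; r/alpha/dropout here are illustrative, not tuned.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # BLOOM's fused attention projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```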
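Next, the dataset. The schema of AryanPrakhar/marketing_mail_data is not documented here, so the `text` column below is an assumption; inspect `dataset.column_names` and adapt the tokenization function to the actual fields:

```python
from datasets import load_dataset

dataset = load_dataset("AryanPrakhar/marketing_mail_data")

def tokenize(batch):
    # Assumes a "text" column holding the prompt and email as one string;
    # check dataset.column_names if the schema differs.
    return tokenizer(batch["text"], truncation=True, max_length=512)

# tokenizer comes from the loading sketch above.
tokenized = dataset.map(
    tokenize,
    batched=True,
    remove_columns=dataset["train"].column_names,
)
```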
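A training configuration combining the knobs the feature list names (batch size, learning rate, gradient accumulation) via the `Trainer` API; the numeric values are placeholders:

```python
from transformers import (
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# All numbers below are placeholders; the project's actual values are not
# recorded in this README.
training_args = TrainingArguments(
    output_dir="bloom-marketing-email",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,   # effective batch size of 32
    learning_rate=2e-4,
    num_train_epochs=3,
    fp16=True,
    logging_steps=10,
)

trainer = Trainer(
    model=model,                      # PEFT-wrapped model from the first sketch
    args=training_args,
    train_dataset=tokenized["train"],
    # mlm=False gives standard causal-LM labels (inputs shifted by one).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Gradient accumulation trades wall-clock time for memory: gradients are summed over several small forward/backward passes before one optimizer step, which lets a 1.7B-parameter model train on a single consumer GPU.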
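Generating an email from a product description. The prompt format depends on how the training data was templated, so the one below is hypothetical:

```python
# Hypothetical prompt template; match whatever format the training data used.
prompt = "Product description: A solar-powered camping lantern.\nMarketing email:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```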
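Finally, publishing to the Hugging Face Hub. The repo id is a placeholder, and `push_to_hub` on a PEFT-wrapped model uploads only the adapter weights, not the full base model:

```python
# "your-username/bloom-marketing-email" is a placeholder repo id; publishing
# requires a prior `huggingface-cli login` with write access.
REPO_ID = "your-username/bloom-marketing-email"

model.push_to_hub(REPO_ID)      # for a PEFT model this uploads the adapter
tokenizer.push_to_hub(REPO_ID)
```

To use the published adapter later, load the base model and attach it with `PeftModel.from_pretrained(base_model, REPO_ID)`.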