Coder

This project implements an AI-powered code generator using PyTorch and the Transformer architecture. It takes natural language instructions and generates the corresponding HTML code.

Table of Contents

  • Features
  • Requirements
  • Installation
  • Project Structure
  • Usage
  • Configuration
  • Contributing
  • License

Features

  • Generates HTML code from natural language instructions
  • Uses BERT for encoding input instructions
  • Implements a Transformer decoder for HTML generation (a sketch of this wiring appears below)
  • Customizable model architecture and training parameters
  • Includes scripts for data generation, training, and inference
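
The encoder–decoder wiring implied by these features can be sketched as follows. This is a minimal illustration, not the exact contents of src/model.py; the class name, layer sizes, and forward signature are assumptions.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class HTMLGenerator(nn.Module):
        """Illustrative sketch: BERT encodes the instruction, a Transformer decoder emits HTML tokens."""
        def __init__(self, vocab_size, d_model=768, nhead=8, num_decoder_layers=6):
            super().__init__()
            # d_model defaults to 768 so the decoder width matches BERT's hidden size
            self.encoder = BertModel.from_pretrained("bert-base-uncased")
            self.embedding = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
            self.decoder = nn.TransformerDecoder(layer, num_layers=num_decoder_layers)
            self.out = nn.Linear(d_model, vocab_size)

        def forward(self, input_ids, attention_mask, tgt_ids):
            # Encode the natural language instruction once; the decoder cross-attends to it
            memory = self.encoder(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
            tgt = self.embedding(tgt_ids)
            # Causal mask: each HTML token may only attend to earlier tokens
            n = tgt_ids.size(1)
            tgt_mask = torch.triu(torch.full((n, n), float("-inf"), device=tgt.device), diagonal=1)
            return self.out(self.decoder(tgt, memory, tgt_mask=tgt_mask))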

Requirements

  • Python 3.7+
  • PyTorch 1.9+
  • Transformers 4.5+
  • pandas
  • tqdm
  • PyYAML

Installation

  1. Clone the repository:

    git clone https://github.com/karthicksakkaravarti/coder.git
    cd coder
    
  2. Create and activate a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
    
  3. Install the required packages:

    pip install -r requirements.txt
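
If requirements.txt is ever missing or out of date, a minimal file matching the versions listed under Requirements would look like this (pins are illustrative):

    torch>=1.9
    transformers>=4.5
    pandas
    tqdm
    PyYAML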
    

Project Structure

html_generator/
│
├── src/
│   ├── __init__.py
│   ├── model.py
│   ├── dataset.py
│   └── utils.py
│
├── scripts/
│   ├── create_sample_data.py
│   ├── train.py
│   └── inference.py
│
├── configs/
│   └── model_config.yaml
│
├── data/
│   └── processed/
│       └── html_dataset.csv
│
├── models/
│   └── saved_models/
│
├── requirements.txt
└── README.md

Usage

Data Generation

To create sample data for training:

python -m scripts.create_sample_data

This writes a CSV file of sample instructions and their corresponding HTML code to data/processed/html_dataset.csv.
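
A hypothetical snippet showing how such instruction/HTML pairs could be produced (the actual templates and column names live in scripts/create_sample_data.py and may differ):

    import pandas as pd

    # Hypothetical instruction -> HTML pairs; the real script's templates may differ
    samples = [
        ("Create a heading that says Welcome", "<h1>Welcome</h1>"),
        ("Make a link to https://example.com labeled Home",
         '<a href="https://example.com">Home</a>'),
        ("Create an unordered list with items Apple and Banana",
         "<ul><li>Apple</li><li>Banana</li></ul>"),
    ]
    pd.DataFrame(samples, columns=["instruction", "html"]).to_csv(
        "data/processed/html_dataset.csv", index=False)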

Training

To train the model:

python -m scripts.train

This script loads the dataset, trains the model, and saves checkpoints to models/saved_models/.
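
In outline, such a loop reads its hyperparameters from the config, iterates over batches with teacher forcing, and checkpoints after each epoch. A compressed sketch (dataset fields and checkpoint names are assumptions, not the exact contents of scripts/train.py):

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader

    def train(model, dataset, num_epochs=10, batch_size=32, learning_rate=1e-4, pad_id=0):
        loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
        optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
        criterion = nn.CrossEntropyLoss(ignore_index=pad_id)  # ignore padding positions
        model.train()
        for epoch in range(num_epochs):
            for input_ids, attention_mask, tgt_ids in loader:
                # Teacher forcing: feed tokens 0..n-1, predict tokens 1..n
                logits = model(input_ids, attention_mask, tgt_ids[:, :-1])
                loss = criterion(logits.reshape(-1, logits.size(-1)),
                                 tgt_ids[:, 1:].reshape(-1))
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            torch.save(model.state_dict(), f"models/saved_models/epoch_{epoch}.pt")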

Inference

To generate HTML from instructions using a trained model:

python -m scripts.inference
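
Generation is typically autoregressive: encode the instruction once, then emit one HTML token at a time. A greedy-decoding sketch (the BOS/EOS token ids and the mapping from ids back to HTML text depend on the repo's vocabulary, so they are assumptions here):

    import torch
    from transformers import BertTokenizer

    @torch.no_grad()
    def generate(model, instruction, bos_id=1, eos_id=2, max_len=256):
        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        enc = tokenizer(instruction, return_tensors="pt")
        model.eval()
        tgt = torch.tensor([[bos_id]])  # start-of-sequence token (id is an assumption)
        for _ in range(max_len):
            logits = model(enc["input_ids"], enc["attention_mask"], tgt)
            next_id = logits[0, -1].argmax().item()  # greedy: take the most likely token
            if next_id == eos_id:
                break
            tgt = torch.cat([tgt, torch.tensor([[next_id]])], dim=1)
        return tgt[0, 1:].tolist()  # generated token ids, minus the BOS token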

Configuration

Model and training parameters can be configured in configs/model_config.yaml (a sample file appears after this list). Key parameters include:

  • vocab_size: Size of the vocabulary
  • d_model: Dimension of the model
  • nhead: Number of heads in multi-head attention
  • num_decoder_layers: Number of decoder layers
  • batch_size: Batch size for training
  • learning_rate: Learning rate for optimization
  • num_epochs: Number of training epochs
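
A hypothetical model_config.yaml illustrating these keys (values are placeholders, not the repo's defaults):

    vocab_size: 30000
    d_model: 768
    nhead: 8
    num_decoder_layers: 6
    batch_size: 32
    learning_rate: 0.0001
    num_epochs: 10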

Contributing

Contributions to improve Coder are welcome. Please follow these steps:

  1. Fork the repository
  2. Create a new branch (git checkout -b feature/your-feature-name)
  3. Make your changes
  4. Commit your changes (git commit -am 'Add some feature')
  5. Push to the branch (git push origin feature/your-feature-name)
  6. Create a new Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
