# openBIS Chatbot

A conversational chatbot for querying and retrieving data from openBIS instances using a Large Language Model (LLM). It lets users interact with openBIS in natural language, making it easier to explore spaces, projects, experiments, and datasets.
## Features
- Natural language understanding using Llama 3.2.
- Integration with openBIS API for real-time data retrieval.
- Modular design for easy maintenance and scalability.
- Handles complex queries and provides friendly, human-like responses.
- Error handling for ambiguous or incomplete queries.
- Secure connection to openBIS with robust authentication.
## Requirements
- Python 3.8 or later.
- A functional openBIS instance.
- Access to the Llama 3.2 model via Ollama.
- `conda` or `pip` installed on your system.
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/your-repo/openbis-chatbot.git
  cd openbis-chatbot
  ```
- Set up a virtual environment (recommended):

  ```bash
  conda create -n chatbot python=3.8
  conda activate chatbot
  ```
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Set up configuration:
  - Edit `config/settings.py` to include your openBIS URL, username, and password.
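A minimal sketch of what `config/settings.py` might contain; the variable names and values below are assumptions for illustration, not the file's actual contents:

```python
# Hypothetical config/settings.py contents -- variable names are assumptions.
OPENBIS_URL = "https://openbis.example.org"   # placeholder instance URL
OPENBIS_USERNAME = "your-username"            # placeholder credentials
OPENBIS_PASSWORD = "your-password"

# Ollama model used for natural-language understanding.
LLM_MODEL = "llama3.2"
```

In practice, credentials are better read from environment variables than hard-coded in a settings file.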
## Usage

- Run the Chatbot:

  ```bash
  python src/openbis_chatbot/main.py
  ```
- Interact with the Chatbot:
  - Example Queries:
    - "List all projects under the space 'MY_SPACE'."
    - "What datasets are available in 'Project A'?"
    - "Give me all experiments under 'Space X'."
- Exit the Chatbot:
  - Type `exit` to close the chatbot.
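The interaction loop described above (answer queries until the user types `exit`) can be sketched as follows; `chat_loop` and `answer_fn` are hypothetical names for illustration, not the actual API of `main.py`:

```python
def chat_loop(answer_fn, read_input=input, write_output=print):
    """Read queries until the user types 'exit', answering each one.

    answer_fn maps a natural-language query to a response string; the
    input/output callables are injectable so the loop is easy to test.
    """
    while True:
        query = read_input("You: ").strip()
        if query.lower() == "exit":
            write_output("Goodbye!")
            break
        write_output(f"Bot: {answer_fn(query)}")
```

Injecting `read_input` and `write_output` keeps the loop free of hard dependencies on the terminal, which also makes it reusable behind other front ends.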
## Project Structure

```
openbis-chatbot/
├── .vscode/
│   └── settings.json
├── examples/
│   ├── example_input.json
│   └── example_output.json
├── src/
│   └── openbis_chatbot/
│       ├── utils/
│       │   ├── logger.py
│       │   └── validators.py
│       ├── prompts/
│       │   ├── base_prompt.txt
│       │   └── examples.txt
│       ├── __init__.py
│       ├── main.py
│       ├── data_retrieval.py
│       ├── llm_interface.py
│       ├── parser.py
│       └── prompt.py
├── tests/
│   ├── data/
│   │   └── # here you can put the files in examples/
│   ├── __init__.py
│   ├── test.py
│   ├── test_data_retrieval.py
│   ├── test_llm_interface.py
│   ├── test_parser.py
│   └── test_prompt.py
├── .gitignore
├── README.md
└── pyproject.toml
```
## How It Works

- User Input:
  - The user provides a natural language query.
  - Example: "List all projects under 'MY_SPACE'."
- LLM Interpretation:
  - The LLM interprets the query and provides structured intent and entities.
  - Output: `{"intent": "list_projects", "space": "MY_SPACE"}`
- Query Parsing:
  - The query parser validates and extracts the intent and entities.
- Data Retrieval:
  - The system uses openBIS APIs to fetch relevant data.
- Response Generation:
  - The chatbot formats the data into a friendly response and displays it to the user.
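The steps above can be sketched end to end; the function and handler names here are illustrative assumptions, and the LLM call is replaced by its already-structured JSON output:

```python
import json

def parse_llm_output(raw: str) -> dict:
    """Query parsing step: validate the structured intent the LLM returned."""
    parsed = json.loads(raw)
    if "intent" not in parsed:
        raise ValueError("LLM output is missing an 'intent' field")
    return parsed

def dispatch(parsed: dict) -> str:
    """Retrieval + response steps: route the intent to a handler and format a reply."""
    handlers = {
        # A real handler would call the openBIS API (e.g. via pybis).
        "list_projects": lambda p: f"Here are the projects in '{p['space']}': ...",
    }
    handler = handlers.get(parsed["intent"])
    if handler is None:
        return "Sorry, I can't handle that request yet."
    return handler(parsed)
```

For example, `dispatch(parse_llm_output('{"intent": "list_projects", "space": "MY_SPACE"}'))` produces a reply that names the requested space, while an unknown intent falls through to the apology message.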
## Testing

To ensure the chatbot is functioning correctly, run the tests:

```bash
pytest tests/
```
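A unit test for the parsing step might look like the sketch below; `parse_query` is a hypothetical helper defined inline for illustration, not necessarily the real API of `src/openbis_chatbot/parser.py` (in the project it would be imported instead):

```python
# Hypothetical helper under test -- in the real project this would live in
# src/openbis_chatbot/parser.py and be imported here instead.
def parse_query(llm_output: dict) -> tuple:
    """Split the LLM's structured output into an intent and its entities."""
    entities = dict(llm_output)
    intent = entities.pop("intent")
    return intent, entities

def test_parse_query_extracts_intent_and_entities():
    intent, entities = parse_query({"intent": "list_projects", "space": "MY_SPACE"})
    assert intent == "list_projects"
    assert entities == {"space": "MY_SPACE"}
```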
## Dependencies

- `pybis`: Library for interacting with openBIS.
- `ollama-python`: Python client for the Llama model.
- `httpx`: HTTP library for making API requests.

Install all dependencies using:

```bash
pip install -r requirements.txt
```
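The data-retrieval layer can wrap `pybis` behind a thin function so it is easy to mock in tests; `list_project_codes` is a hypothetical helper for illustration, assuming the client behaves like a logged-in `pybis` `Openbis` instance:

```python
def list_project_codes(openbis_client, space: str) -> list:
    """Return the codes of all projects in the given openBIS space.

    openbis_client is expected to behave like a logged-in pybis Openbis
    instance; it is injected here so tests can pass a stub instead of
    talking to a live server.
    """
    projects = openbis_client.get_projects(space=space)
    return [project.code for project in projects]
```

Because the client is injected, the function can be unit-tested with a stub object that returns canned projects, without any network access.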
## Limitations

- Ambiguous queries may require follow-up questions for clarification.
- Large datasets may take longer to process.
## Future Improvements

- Add support for more openBIS operations (e.g., creating projects, deleting datasets).
- Enhance the LLM prompt for better context understanding.
- Integrate with communication platforms like Slack or Teams.
## Contributing

- Fork the repository.
- Create a feature branch:

  ```bash
  git checkout -b feature-name
  ```

- Commit your changes and push:

  ```bash
  git commit -m "Add feature name"
  git push origin feature-name
  ```

- Submit a pull request.
## License

This project is licensed under the MIT License.
## Contact

For questions or support, reach out to:

- Name: Carlos Madariaga
- Email: carlosmadariagaaramendi@gmail.com
- GitHub: Carlos Madariaga