Ask A Question is a free and open-source tool created to help non-profit organizations, governments in developing nations, and social sector organizations use Large Language Models for responding to citizen inquiries in their native languages.
- Match your questions to content in the database using embeddings from LLMs.
- Craft a custom response to the question using LLM chat and the content in your database.
- Connect to your own chatbot on platforms like Turn.io, Glific, and Typebot using our APIs.
- Use the AAQ App to add, edit, and delete content in the database (sign up for a demo here).
- Identify urgent or important messages based on your own criteria.
- See which content is most sought after, which kinds of questions receive poor feedback, identify missing content, and more.
- Refine or clarify your question through conversation.
- Respond with not just text but voice, images, and videos as well.
- Monitor uptime, response rates, throughput, HTTP response codes, and more.
Note
Looking for other features? Please raise an issue with [FEATURE REQUEST]
before the title.
To get answers from your database of contents, you can use the /search
endpoint. This endpoint returns the following:
- Search results: Finds the most similar content in the database using cosine distance between embeddings.
- (Optionally) LLM-generated response: Crafts a custom response with LLM chat, based on the most similar content.
See docs or API docs for more details and other API endpoints.
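The "search results" step ranks content by cosine distance between embeddings; sorting by ascending cosine distance is the same as sorting by descending cosine similarity. As a toy illustration only (3-dimensional vectors standing in for real LLM embeddings; the content names and vectors below are made up, not AAQ's actual code):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy query embedding and toy content embeddings (hypothetical values).
query = [0.1, 0.9, 0.2]
contents = {
    "faq_greetings": [0.1, 0.8, 0.3],
    "faq_payments": [0.9, 0.1, 0.1],
}

# Rank content by descending similarity (i.e. ascending cosine distance).
ranked = sorted(contents, key=lambda k: cosine_similarity(query, contents[k]),
                reverse=True)
```

Here `ranked[0]` is the content whose embedding points in the direction closest to the query's, which is what the endpoint returns as its top search result.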
curl -X 'POST' \
'https://[DOMAIN]/api/search' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <BEARER TOKEN>' \
-H 'Content-Type: application/json' \
-d '{
"query_text": "how are you?",
"generate_llm_response": false,
"query_metadata": {}
}'
The query looks the same as above, except generate_llm_response is set to true:
curl -X 'POST' \
'https://[DOMAIN]/api/search' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <BEARER TOKEN>' \
-H 'Content-Type: application/json' \
-d '{
"query_text": "this is my question",
"generate_llm_response": true,
"query_metadata": {}
}'
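The curl calls above can also be made from Python. The sketch below wraps the /search endpoint in a small helper using only the standard library; the domain and bearer token are placeholders you must supply, and the helper names are ours for illustration, not part of AAQ.

```python
import json
import urllib.request

def build_search_request(domain, token, query_text,
                         generate_llm_response=False):
    """Build the POST request for the /search endpoint shown above."""
    payload = {
        "query_text": query_text,
        "generate_llm_response": generate_llm_response,
        "query_metadata": {},
    }
    return urllib.request.Request(
        url=f"https://{domain}/api/search",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "accept": "application/json",
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def search(domain, token, query_text, generate_llm_response=False):
    """Send the request and return the parsed JSON response."""
    req = build_search_request(domain, token, query_text,
                               generate_llm_response)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `search("your-domain.example", "<BEARER TOKEN>", "this is my question", generate_llm_response=True)` mirrors the second curl call.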
You can access the admin console at
https://[DOMAIN]/
We use docker-compose to orchestrate containers with a reverse proxy that manages all incoming traffic to the service. The database and LiteLLM proxy are only accessed by the core app.
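As a rough sketch of that layout (service and image names here are hypothetical placeholders, not AAQ's actual compose file):

```yaml
# Hypothetical sketch, not AAQ's real docker-compose.yml: a reverse proxy
# is the only public entrypoint; the database and LiteLLM proxy sit on an
# internal network reachable only by the core app.
services:
  reverse_proxy:              # handles all incoming traffic
    image: caddy:2            # placeholder proxy image
    ports:
      - "80:80"
      - "443:443"
    networks: [public, internal]
  core_app:                   # serves the API and admin app
    image: example/aaq-core   # placeholder image name
    networks: [internal]
  db:                         # accessed only by the core app
    image: postgres:16
    networks: [internal]
  litellm_proxy:              # accessed only by the core app
    image: example/litellm    # placeholder image name
    networks: [internal]
networks:
  public: {}
  internal:
    internal: true            # no direct traffic from outside
```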
See here for full documentation.