diff --git a/README.md b/README.md
index b6fcca1..f2ff91f 100644
--- a/README.md
+++ b/README.md
@@ -49,12 +49,12 @@ $env:API_KEY="your_api_key""
 ### 🔧 Configuration
 `codai` requires a `config.yml` file in the `root of your working directory` or using `environment variables` to set below configs `globally` as a configuration.
-By default codai config works with `openai` provider and the `config.yml` contains the following values:
+The `config.yml` file should look like the following example, based on your `AI provider`:
-**config.yml - openai sample**
+**config.yml**
 ```yml
 ai_provider_config:
-  provider_name: "openai"
+  provider_name: "openai" # openai | ollama
   chat_completion_url: "https://api.openai.com/v1/chat/completions"
   chat_completion_model: "gpt-4o"
   embedding_url: "https://api.openai.com/v1/embeddings" #(Optional, If you want use RAG.)
@@ -65,23 +65,7 @@ theme: "dracula"
 rag: true #(Optional, If you want use RAG.)
 ```
-Also, to provide the config for the `ollama` provider and the `config.yml` contains the following values:
-
-**config.yml - ollama sample**
-
-```yml
-ai_provider_config:
-  provider_name: "ollama"
-  chat_completion_url: "http://localhost:11434/v1/chat/completions"
-  chat_completion_model: "llama3.1"
-  embedding_url: "http://localhost:11434/v1/embeddings" #(Optional, If you want use RAG.)
-  embedding_model: "all-minilm:l6-v2" #(Optional, If you want use RAG.)
-  temperature: 0.2
-  threshold: 0.3 #(Optional, If you want use RAG.)
-theme: "dracula"
-rag: true #(Optional, If you want use RAG.)
-```
-
+> Note: For the `ollama` provider, use the base URL `http://localhost:11434` in the chat and embeddings URLs, and set your desired Ollama models as the chat and embeddings models.
 > Note: We used the standard integration of [OpenAI APIs](https://platform.openai.com/docs/api-reference/introduction) and [Ollama APIs](https://github.com/ollama/ollama/blob/main/docs/api.md) and you can find more details in documentation of each APIs.
If you wish to customize your configuration, you can create your own `config.yml` file and place it in the `root directory` of `each project` you want to analyze with codai. If `no configuration` file is provided, codai will use the `default settings`.
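As an illustration of the Ollama note in the diff above, a `config.yml` for the `ollama` provider (values taken from the sample that this change removes) would look like the following sketch; the model names are examples, not requirements:

```yml
ai_provider_config:
  provider_name: "ollama"
  chat_completion_url: "http://localhost:11434/v1/chat/completions"
  chat_completion_model: "llama3.1"
  embedding_url: "http://localhost:11434/v1/embeddings" #(Optional, if you want to use RAG.)
  embedding_model: "all-minilm:l6-v2" #(Optional, if you want to use RAG.)
  temperature: 0.2
  threshold: 0.3 #(Optional, if you want to use RAG.)
theme: "dracula"
rag: true #(Optional, if you want to use RAG.)
```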