
Merge branch 'main' into feat/WMS_V0_b
jacopo-chevallard committed Nov 22, 2024
2 parents fdbeaf1 + a4e42b0 commit 3e38c90
Showing 48 changed files with 7,657 additions and 37 deletions.
2 changes: 1 addition & 1 deletion .release-please-manifest.json
Original file line number Diff line number Diff line change
@@ -1,3 +1,3 @@
{
-  "core": "0.0.23"
+  "core": "0.0.24"
}
15 changes: 15 additions & 0 deletions core/CHANGELOG.md
@@ -1,5 +1,20 @@
# Changelog

## [0.0.24](https://github.com/QuivrHQ/quivr/compare/core-0.0.23...core-0.0.24) (2024-11-14)


### Features

* kms-migration ([#3446](https://github.com/QuivrHQ/quivr/issues/3446)) ([1356d87](https://github.com/QuivrHQ/quivr/commit/1356d87098ae84776a5d47b631d07a1c8e92e291))
* **megaparse:** add sdk ([#3462](https://github.com/QuivrHQ/quivr/issues/3462)) ([190d971](https://github.com/QuivrHQ/quivr/commit/190d971bd71333924b88ba747d3c6a833ca65d92))


### Bug Fixes

* added chunk_size in tika processor ([#3466](https://github.com/QuivrHQ/quivr/issues/3466)) ([063bbd3](https://github.com/QuivrHQ/quivr/commit/063bbd323dfca2dfc22fc5416c1617ed61d2e2ab))
* modify megaparse strategy ([#3474](https://github.com/QuivrHQ/quivr/issues/3474)) ([da97b2c](https://github.com/QuivrHQ/quivr/commit/da97b2cf145c86ed577be698ae837b3dc26f6921))
* supported extensions for megaparse ([#3477](https://github.com/QuivrHQ/quivr/issues/3477)) ([72b979d](https://github.com/QuivrHQ/quivr/commit/72b979d4e4d6e6efc45d47c7aba942eb909adc3e))

## [0.0.23](https://github.com/QuivrHQ/quivr/compare/core-0.0.22...core-0.0.23) (2024-10-31)


2 changes: 1 addition & 1 deletion core/pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "quivr-core"
-version = "0.0.23"
+version = "0.0.24"
description = "Quivr core RAG package"
authors = [
{ name = "Stan Girard", email = "stan@quivr.app" }
3 changes: 2 additions & 1 deletion docs/docs/workflows/examples/rag_with_web_search.md
@@ -15,7 +15,8 @@ Follow the instructions below to create the agentic RAG workflow shown above, wh
1. Add your API Keys to your environment variables
```python
import os
-os.environ["OPENAI_API_KEY"] = "myopenai_apikey"
+os.environ["OPENAI_API_KEY"] = "my_openai_api_key"
+os.environ["TAVILY_API_KEY"] = "my_tavily_api_key"

```
Check our `.env.example` file to see the possible environment variables you can configure. Quivr supports APIs from Anthropic, OpenAI, and Mistral. It also supports local models using Ollama.
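Hard-coding keys in a script risks leaking them; the `.env.example` file mentioned above suggests keeping them in a `.env` file instead. Below is a minimal stdlib sketch of loading such a file (the usual tool for this is `python-dotenv`; `load_env_file` is a hypothetical helper, not part of Quivr):

```python
import os


def load_env_file(path=".env"):
    """Minimal .env loader: KEY=VALUE lines; blank lines and '#' comments ignored."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: never clobber variables already set in the environment
            os.environ.setdefault(key.strip(), value.strip().strip('"'))


# Usage: keep OPENAI_API_KEY / TAVILY_API_KEY in .env, then call
# load_env_file() once at startup instead of assigning os.environ in code.
```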
40 changes: 20 additions & 20 deletions examples/chatbot/.chainlit/config.toml
@@ -53,10 +53,10 @@ edit_message = true

[UI]
# Name of the assistant.
-name = "Assistant"
+name = "Quivr"

# Description of the assistant. This is used for HTML tags.
-# description = ""
+description = "Demo of Quivr"

# Large size content are by default collapsed for a cleaner ui
default_collapse_content = true
@@ -65,11 +65,11 @@ default_collapse_content = true
cot = "full"

# Link to your github repo. This will add a github button in the UI's header.
-# github = ""
+github = "https://github.com/quivrhq/quivr"

# Specify a CSS file that can be used to customize the user interface.
# The CSS file can be served from the public directory or via an external link.
-# custom_css = "/public/test.css"
+# custom_css = "/public/custom.css"

# Specify a Javascript file that can be used to customize the user interface.
# The Javascript file can be served from the public directory.
@@ -88,33 +88,33 @@ cot = "full"

[UI.theme]
default = "dark"
-#layout = "wide"
-#font_family = "Inter, sans-serif"
+font_family = "Tahoma,Verdana,Segoe,sans-serif"

# Override default MUI light theme. (Check theme.ts)
[UI.theme.light]
-#background = "#FAFAFA"
-#paper = "#FFFFFF"
+background = "#fcfcfc"
+paper = "#f8f8f8"

[UI.theme.light.primary]
-#main = "#F80061"
-#dark = "#980039"
-#light = "#FFE7EB"
+main = "#6142d4"
+dark = "#6e53cf"
+light = "#6e53cf30"
[UI.theme.light.text]
-#primary = "#212121"
-#secondary = "#616161"
+primary = "#1f1f1f"
+secondary = "#818080"

# Override default MUI dark theme. (Check theme.ts)
[UI.theme.dark]
-#background = "#FAFAFA"
-#paper = "#FFFFFF"
+background = "#252525"
+paper = "#1f1f1f"

[UI.theme.dark.primary]
-#main = "#F80061"
-#dark = "#980039"
-#light = "#FFE7EB"
+main = "#6142d4"
+dark = "#6e53cf"
+light = "#6e53cf30"
[UI.theme.dark.text]
-#primary = "#EEEEEE"
-#secondary = "#BDBDBD"
+primary = "#f4f4f4"
+secondary = "#c8c8c8"

[meta]
generated_by = "1.1.402"
1 change: 1 addition & 0 deletions examples/chatbot/.gitignore
@@ -8,3 +8,4 @@ wheels/

# venv
.venv
+.files
43 changes: 43 additions & 0 deletions examples/chatbot/basic_rag_workflow.yaml
@@ -0,0 +1,43 @@
workflow_config:
  name: "standard RAG"
  nodes:
    - name: "START"
      edges: ["filter_history"]

    - name: "filter_history"
      edges: ["rewrite"]

    - name: "rewrite"
      edges: ["retrieve"]

    - name: "retrieve"
      edges: ["generate_rag"]

    - name: "generate_rag" # the name of the last node, from which we want to stream the answer to the user
      edges: ["END"]
      tools:
        - name: "cited_answer"

# Maximum number of previous conversation iterations
# to include in the context of the answer
max_history: 10

# Reranker configuration
# reranker_config:
#   # The reranker supplier to use
#   supplier: "cohere"

#   # The model to use for the reranker for the given supplier
#   model: "rerank-multilingual-v3.0"

#   # Number of chunks returned by the reranker
#   top_n: 5

# Configuration for the LLM
llm_config:

  # maximum number of tokens passed to the LLM to generate the answer
  max_output_tokens: 4000

  # temperature for the LLM
  temperature: 0.7
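The nodes above form a linear chain from START to END. A quick sketch of how such a graph can be validated and linearized (not part of quivr-core; node names and the list-of-dicts shape mirror the YAML above):

```python
# Hypothetical helper: follow each node's single outgoing edge from START,
# raising if an edge points at a node that was never declared.
def linearize_workflow(nodes):
    by_name = {n["name"]: n for n in nodes}
    order, current = [], "START"
    while current != "END":
        order.append(current)
        node = by_name.get(current)
        if node is None:
            raise ValueError(f"undeclared node: {current}")
        current = node["edges"][0]  # linear chain: exactly one edge per node
    order.append("END")
    return order


# The same graph as in basic_rag_workflow.yaml, as plain Python data.
nodes = [
    {"name": "START", "edges": ["filter_history"]},
    {"name": "filter_history", "edges": ["rewrite"]},
    {"name": "rewrite", "edges": ["retrieve"]},
    {"name": "retrieve", "edges": ["generate_rag"]},
    {"name": "generate_rag", "edges": ["END"]},
]
print(linearize_workflow(nodes))
# → ['START', 'filter_history', 'rewrite', 'retrieve', 'generate_rag', 'END']
```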
28 changes: 24 additions & 4 deletions examples/chatbot/main.py
@@ -2,6 +2,7 @@

import chainlit as cl
from quivr_core import Brain
+from quivr_core.rag.entities.config import RetrievalConfig


@cl.on_chat_start
@@ -26,7 +27,7 @@ async def on_chat_start():
            text = f.read()

        with tempfile.NamedTemporaryFile(
-            mode="w", suffix=".txt", delete=False
+            mode="w", suffix=file.name, delete=False
        ) as temp_file:
            temp_file.write(text)
            temp_file.flush()
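The switch from a fixed `.txt` suffix to `suffix=file.name` in the hunk above keeps the uploaded file's extension on the temporary copy, so extension-based parser routing works for non-text uploads. A small stdlib sketch of the behavior (the file name is illustrative):

```python
import pathlib
import tempfile

# NamedTemporaryFile appends `suffix` to the random part of the temp name,
# so passing the original file name preserves its extension.
with tempfile.NamedTemporaryFile(mode="w", suffix="report.pdf", delete=False) as tmp:
    tmp.write("placeholder contents")
path = pathlib.Path(tmp.name)
print(path.suffix)  # → .pdf
```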
@@ -47,17 +48,36 @@ async def on_chat_start():
@cl.on_message
async def main(message: cl.Message):
    brain = cl.user_session.get("brain")  # type: Brain
+    path_config = "basic_rag_workflow.yaml"
+    retrieval_config = RetrievalConfig.from_yaml(path_config)

    if brain is None:
        await cl.Message(content="Please upload a file first.").send()
        return

    # Prepare the message for streaming
-    msg = cl.Message(content="")
+    msg = cl.Message(content="", elements=[])
    await msg.send()

+    saved_sources = set()
+    saved_sources_complete = []
+    elements = []
+
    # Use the ask_stream method for streaming responses
-    async for chunk in brain.ask_streaming(message.content):
+    async for chunk in brain.ask_streaming(message.content, retrieval_config=retrieval_config):
        await msg.stream_token(chunk.answer)

+        for source in chunk.metadata.sources:
+            if source.page_content not in saved_sources:
+                saved_sources.add(source.page_content)
+                saved_sources_complete.append(source)
+                print(source)
+                elements.append(cl.Text(name=source.metadata["original_file_name"], content=source.page_content, display="side"))
+
    await msg.send()
+    sources = ""
+    for source in saved_sources_complete:
+        sources += f"- {source.metadata['original_file_name']}\n"
+    msg.elements = elements
+    msg.content = msg.content + f"\n\nSources:\n{sources}"
+    await msg.update()
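The handler above de-duplicates retrieved chunks by their text before listing them in a Sources footer. The same pattern in isolation (the function name and the `(content, file_name)` tuples are illustrative, not Quivr API):

```python
def dedupe_sources(sources):
    """Keep only the first occurrence of each chunk, keyed on its text,
    so the same passage retrieved twice is listed once."""
    seen = set()
    unique = []
    for content, file_name in sources:
        if content not in seen:
            seen.add(content)
            unique.append((content, file_name))
    return unique


chunks = [
    ("Quivr is a RAG framework.", "readme.txt"),
    ("It supports streaming answers.", "docs.txt"),
    ("Quivr is a RAG framework.", "readme.txt"),  # duplicate chunk
]
footer = "".join(f"- {name}\n" for _, name in dedupe_sources(chunks))
print(footer)
# → - readme.txt
#   - docs.txt
```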
Binary file added examples/chatbot/public/favicon.ico
Binary file not shown.
Binary file added examples/chatbot/public/logo_dark.png
Binary file added examples/chatbot/public/logo_light.png
