fix: Instructor HuggingFaceEmbeddings with hkunlp/instructor-xl
fynnfluegge committed Mar 3, 2024
1 parent 428a7dc commit 526d50c
Showing 2 changed files with 18 additions and 7 deletions.
18 changes: 14 additions & 4 deletions README.md
@@ -20,7 +20,7 @@ Built with [langchain](https://github.com/langchain-ai/langchain), [treesitter](
## ✨ Features

- 🔎  Semantic code search
- 💬  GPT-like chat with your codebase
- ⚙️  Synchronize vector store and latest code changes with ease
- 💻  100% local embeddings and llms
- sentence-transformers, instructor-embeddings, llama.cpp, Ollama
@@ -67,13 +67,13 @@ codeqai sync
```
codeqai app
```

<div align="center">

<img src="https://github.com/fynnfluegge/codeqai/assets/16321871/3a9105f1-066a-4cbd-a096-c8a7bd2068d3" width="800">

</div>


> [!NOTE]
> On first usage, the repository will be indexed with the configured embeddings model, which might take a while.
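
For a rough idea of what this indexing step amounts to, here is an illustrative sketch (not codeqai's actual internals) that embeds a few code snippets with the instructor model and stores them in a FAISS index, assuming `langchain-community`, `sentence-transformers` and `faiss-cpu` are installed:

```
# Illustrative sketch only -- not codeqai's actual indexing code.
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS

# Pretend these are code snippets extracted from the repository.
snippets = [
    "def add(a, b):\n    return a + b",
    "class Cache:\n    def __init__(self):\n        self.store = {}",
]

embeddings = HuggingFaceEmbeddings(model_name="hkunlp/instructor-xl")
vector_store = FAISS.from_texts(snippets, embeddings)  # the slow, one-time step
vector_store.save_local("vector_cache")                # reused on subsequent runs
```
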
@@ -82,19 +82,26 @@ codeqai app
- Python >=3.9,<3.12

## 📦 Installation

Install in an isolated environment with `pipx`:

```
pipx install codeqai
```

⚠ Make sure pipx is using Python >=3.9,<3.12.
To specify the Python version explicitly with pipx, activate the desired Python version (e.g. with `pyenv shell 3.X.X`) and install with:

```
pipx install codeqai --python $(which python)
```

If you are still facing issues with pipx, you can also install directly from PyPI with pip:

```
pip install codeqai
```

However, it is recommended to use pipx to benefit from isolated environments for the dependencies.
Visit the [Troubleshooting](https://github.com/fynnfluegge/codeqai?tab=readme-ov-file#-troubleshooting) section for solutions to known issues during installation.

@@ -151,6 +158,7 @@ export OPENAI_API_VERSION = "2023-05-15"
- [x] C++
- [x] C
- [x] C#
- [x] Ruby

## 💡 How it works

@@ -182,16 +190,19 @@ will download the `codellama-13b-python.Q5_K_M` model. After the download has fi
> `llama.cpp` compatible models must be in the `.gguf` format.
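
As an illustration (not a snippet from codeqai itself), a local `.gguf` model can be wired up through LangChain's `LlamaCpp` wrapper roughly like this; the model path and parameters below are placeholders and assume `llama-cpp-python` is installed:

```
# Hedged sketch: path and parameters are placeholders, not values codeqai ships.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="models/codellama-13b-python.Q5_K_M.gguf",  # placeholder local path
    n_ctx=2048,        # context window size
    temperature=0.1,   # keep answers focused
)
print(llm.invoke("Explain what a vector store is in one sentence."))
```
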
## 🛟 Troubleshooting

- ### During installation with `pipx`

```
pip failed to build package: tiktoken
Some possibly relevant errors from pip install:
error: subprocess-exited-with-error
error: can't find Rust compiler
```

Make sure the Rust compiler is installed on your system; you can install it from [here](https://www.rust-lang.org/tools/install).

- ### During installation of `faiss`
```
× Building wheel for faiss-cpu (pyproject.toml) did not run successfully.
@@ -206,7 +217,6 @@
```
Make sure to have codeqai installed with Python <3.12. There is no faiss wheel available yet for Python 3.12.


## 🌟 Contributing

If you are missing a feature or facing a bug, don't hesitate to open an issue or raise a PR.
7 changes: 4 additions & 3 deletions codeqai/embeddings.py
@@ -1,6 +1,5 @@
import inquirer
-from langchain_community.embeddings import (HuggingFaceEmbeddings,
-                                             HuggingFaceInstructEmbeddings)
+from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_openai import OpenAIEmbeddings

from codeqai import utils
@@ -39,7 +38,9 @@ def __init__(
except ImportError:
self._install_instructor_embedding()

-self.embeddings = HuggingFaceEmbeddings()
+self.embeddings = HuggingFaceEmbeddings(
+    model_name="hkunlp/instructor-xl"
+)

def _install_sentence_transformers(self):
question = [
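
The net effect of this change is that the instructor model is loaded through the plain `HuggingFaceEmbeddings` wrapper rather than the dedicated `HuggingFaceInstructEmbeddings` class. A minimal sketch of the resulting usage, assuming `sentence-transformers` is installed:

```
# Minimal sketch of the fixed configuration; mirrors the change above.
from langchain_community.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="hkunlp/instructor-xl")
vector = embeddings.embed_query("def fibonacci(n): ...")
print(len(vector))  # dimensionality of the returned embedding
```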
