Simplify the python dependency installation
countzero committed Jun 13, 2024
1 parent f699bdc commit dd6d6db
Showing 4 changed files with 23 additions and 13 deletions.
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,12 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+## [1.20.0] - 2024-06-13
+
+### Changed
+- [Build] Simplify the python dependency installation
+- [Build] Downgrade the "torch" package to 2.1.2+cu121
+
## [1.19.0] - 2024-06-13

### Added
17 changes: 9 additions & 8 deletions rebuild_llama.cpp.ps1
@@ -189,17 +189,18 @@ cmake `

Copy-Item -Path "../../OpenBLAS/bin/libopenblas.dll" -Destination "./bin/Release/libopenblas.dll"

Set-Location -Path "../"
Set-Location -Path "../../../"

conda activate llama.cpp

# We are installing the latest available version of the dependencies.
pip install --upgrade --upgrade-strategy "eager" -r ./requirements.txt
Write-Host "[Python] Installing dependencies..." -ForegroundColor "Yellow"

Set-Location -Path "../../"
conda activate llama.cpp

# We are enforcing specific versions on some packages.
pip install -r ./requirements_override.txt
# We are installing the latest available version of all llama.cpp
# project dependencies and also overriding some package versions.
pip install `
--upgrade `
--upgrade-strategy "eager" `
--requirement ./requirements_override.txt

conda list

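For reference, the consolidated installation step can be reproduced on its own. The following is only a sketch: it assumes the repository root as the working directory, an existing "llama.cpp" Conda environment, and pip being available inside that environment.

# Sketch: run the new single-step dependency installation standalone.
conda activate llama.cpp

# pip follows the nested "--requirement" directive inside requirements_override.txt
# into ./vendor/llama.cpp/requirements.txt and eagerly upgrades every dependency.
pip install `
    --upgrade `
    --upgrade-strategy "eager" `
    --requirement ./requirements_override.txt

# Optional check (assumption): confirm the pinned CUDA build of torch was installed.
pip show torch | Select-String -Pattern "^Version"
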
11 changes: 7 additions & 4 deletions requirements_override.txt
@@ -1,4 +1,7 @@
-# We are using a specific version of the "torch"
-# package which supports a specific CUDA version.
---extra-index-url https://download.pytorch.org/whl/nightly/cu121
-torch==2.4.0.dev20240516+cu121
+# We are importing the llama.cpp project dependencies.
+--requirement ./vendor/llama.cpp/requirements.txt
+
+# We are overriding the "torch" package version with a
+# specific compatible version that also supports CUDA.
+--extra-index-url https://download.pytorch.org/whl/cu121
+torch==2.1.2+cu121
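
The override file now does two things: the nested "--requirement" line imports the upstream llama.cpp dependency list, and the explicit torch pin together with the cu121 extra index constrains which wheel the resolver may select. As a sketch, the resolution can be previewed without modifying the environment, assuming a pip version of 22.2 or newer that supports --dry-run:

# Sketch: resolve requirements_override.txt without installing anything.
pip install `
    --dry-run `
    --upgrade `
    --upgrade-strategy "eager" `
    --requirement ./requirements_override.txt

# pip prints the packages it would install; torch should resolve to 2.1.2+cu121,
# pulled from the https://download.pytorch.org/whl/cu121 extra index.
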
2 changes: 1 addition & 1 deletion vendor/llama.cpp
Submodule llama.cpp updated 2 files
+0 −3 README.md
+3 −3 ggml-rpc.cpp
