Use container right now
ramalama run/serve currently require the container, which has the version
of llama.cpp that works.

Long-term we may be able to remove this.

Signed-off-by: Eric Curtin <ecurtin@redhat.com>
ericcurtin committed Jul 31, 2024
1 parent 3776c27 commit d40b962
Showing 2 changed files with 10 additions and 0 deletions.
ci.sh: 5 changes (5 additions & 0 deletions)
@@ -43,6 +43,11 @@ main() {
./ramalama list | grep tinyllama
./ramalama list | grep tiny-vicuna-1b
./ramalama list | grep NAME

if [ "$os" = "Linux" ]; then # no macos support for running/serving models yet
timeout 8 ./ramalama serve granite-code | grep -m1 -i listen
fi

# ramalama list | grep granite-code
# ramalama rm granite-code
}
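For context, the new ci.sh check exercises `ramalama serve` without leaving a server running: timeout bounds the process to 8 seconds and grep -m1 -i exits on the first line mentioning that the server is listening. A minimal sketch of the same pattern outside CI, assuming a local ./ramalama checkout with the granite-code model already pulled:

# Sketch of the same smoke test as in ci.sh; success means a "listen" line was
# seen before the 8-second timeout killed the server (pipeline status is grep's).
if timeout 8 ./ramalama serve granite-code | grep -m1 -i listen; then
    echo "serve started and reported it is listening"
else
    echo "serve did not report listening within 8 seconds" >&2
fi
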
ramalama: 5 changes (5 additions & 0 deletions)
@@ -342,6 +342,11 @@ def main(args):
conman = select_container_manager()
ramalama_store = get_ramalama_store()

if conman:
conman_args = [conman, "run", "--rm", "-it", "--security-opt=label=disable", f"-v{ramalama_store}:/var/lib/ramalama", f"-v{os.path.expanduser('~')}:{os.path.expanduser('~')}", "-v/tmp:/tmp",
f"-v{__file__}:{__file__}", "quay.io/ramalama/ramalama:latest", __file__] + args
os.execvp(conman, conman_args)

try:
cmd = args.pop(0)
funcDict[cmd](ramalama_store, args)
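For context, the added block re-executes the ramalama script inside the quay.io/ramalama/ramalama:latest image so that run/serve use the container's llama.cpp. A rough shell equivalent of what os.execvp ends up running, assuming podman is the selected container manager; STORE, SCRIPT, and the trailing "run granite-code" arguments are placeholders for get_ramalama_store(), __file__, and the original CLI args:

# Placeholders; the script derives these values at runtime.
STORE=/path/to/ramalama-store   # get_ramalama_store()
SCRIPT=/path/to/ramalama        # __file__ (this script)

podman run --rm -it --security-opt=label=disable \
    -v"$STORE":/var/lib/ramalama \
    -v"$HOME":"$HOME" \
    -v/tmp:/tmp \
    -v"$SCRIPT":"$SCRIPT" \
    quay.io/ramalama/ramalama:latest "$SCRIPT" run granite-code
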
