
A CLI to manage, install, and configure llama inference implementations in multiple languages


llama2-shepherd

Llama Shepherd is a command-line tool for quickly managing and experimenting with multiple versions of llama inference implementations. It originated from the llama2.c project by Andrej Karpathy.


# Language Name GitHub Author
1. Rust llama2.rs https://github.com/gaxler/llama2.rs @gaxler
2. Rust llama2.rs https://github.com/leo-du/llama2.rs @leo-du
3. Rust llama2-rs https://github.com/danielgrittner/llama2-rs @danielgrittner
4. Rust llama2.rs https://github.com/lintian06/llama2.rs @lintian06
5. Rust pecca.rs https://github.com/rahoua/pecca-rs @rahoua
6. Rust llama2.rs https://github.com/flaneur2020/llama2.rs @flaneur2020
7. Go go-llama2 https://github.com/tmc/go-llama2 @tmc
8. Go llama2.go https://github.com/nikolaydubina/llama2.go @nikolaydubina
9. Go llama2.go https://github.com/haormj/llama2.go @haormj
10. Go llama2.go https://github.com/saracen/llama2.go @saracen
11. Android llama2.c-android https://github.com/Manuel030/llama2.c-android @Manuel030
12. Android llama2.c-android-wrapper https://github.com/celikin/llama2.c-android-wrapper @celikin
13. C++ llama2.cpp https://github.com/leloykun/llama2.cpp @leloykun
14. C++ llama2.cpp https://github.com/coldlarry/llama2.cpp @coldlarry
15. CUDA llama_cu_awq https://github.com/ankan-ban/llama_cu_awq @ankan-ban
16. JavaScript llama2.js https://github.com/epicure/llama2.js @epicure
17. JavaScript llamajs https://github.com/agershun/llamajs @agershun
18. JavaScript llama2.ts https://github.com/wizzard0/llama2.ts @oleksandr_now
19. JavaScript llama2.c-emscripten https://github.com/gohai/llama2.c-emscripten @gohai
20. Zig llama2.zig https://github.com/cgbur/llama2.zig @cgbur
21. Zig llama2.zig https://github.com/vodkaslime/llama2.zig @vodkaslime
22. Zig llama2.zig https://github.com/clebert/llama2.zig @clebert
23. Julia llama2.jl https://github.com/juvi21/llama2.jl @juvi21
24. Scala llama2.scala https://github.com/jrudolph/llama2.scala @jrudolph
25. Java llama2.java https://github.com/mukel/llama2.java @mukel
26. Java llama2.tornadovm.java https://github.com/mikepapadim/llama2.tornadovm.java @mikepapadim
27. Java Jlama https://github.com/tjake/Jlama @tjake
28. Java llama2j https://github.com/LastBotInc/llama2j @lasttero
29. Kotlin llama2.kt https://github.com/madroidmaq/llama2.kt @madroidmaq
30. Python llama2.py https://github.com/tairov/llama2.py @tairov
31. C# llama2.cs https://github.com/trrahul/llama2.cs @trrahul
32. Dart llama2.dart https://github.com/yiminghan/llama2.dart @yiminghan
33. Web llama2c-web https://github.com/dmarcos/llama2.c-web @dmarcos
34. WebAssembly icpp-llm https://github.com/icppWorld/icpp-llm N/A
35. Fortran llama2.f90 https://github.com/rbitr/llama2.f90 N/A
36. Mojo llama2.🔥 https://github.com/tairov/llama2.mojo @tairov
37. OCaml llama2.ml https://github.com/jackpeck/llama2.ml @jackpeck
38. Everywhere llama2.c https://github.com/trholding/llama2.c @trholding
39. Bilingual llama2.c-zh https://github.com/chenyangMl/llama2.c-zh @chenyangMl

How to use:

lshep

List Available Llama Options

To list available llama options, use the following command:

python3 llamashepherd/main.py list [Optional][LANGUAGE]

Replace [LANGUAGE] with the language you want to filter by; if it is omitted, all options are displayed.
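
For example, to show only the Rust implementations (assuming the filter accepts the language names as they appear in the table above):

python3 llamashepherd/main.py list Rust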

Interactively Install Llama Options

To interactively install llama options, use the following command:

python3 llamashepherd/main.py install 

Initialize TinyLlamas Models

To initialize llama models, use the following command:

python3 llamashepherd/main.py models 

This command allows you to download and configure the tokenizer and/or TinyLlama models.
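
Putting the commands together, a typical session might look like the following (the Rust filter value and the ordering of the steps are assumptions based on the commands described above):

python3 llamashepherd/main.py list Rust   # browse the available Rust implementations
python3 llamashepherd/main.py install     # pick and install implementations interactively
python3 llamashepherd/main.py models      # download the tokenizer and/or TinyLlama models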

License

This project is licensed under the MIT License - see the LICENSE file for details.
