A very simple one-file playground, inspired by the OpenAI playground, for using the llama.cpp HTTP server. More information about server.cpp: https://github.com/ggerganov/llama.cpp/tree/master/examples/server
Edit vite.config.ts and change the "target" value to the IP address of your llama.cpp server.
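For reference, the relevant part of vite.config.ts might look roughly like the sketch below. The proxied path ("/completion") and the address shown are assumptions, not the repo's actual values, so keep whatever paths the existing config defines and only swap in your server's IP and port. Note that a Vite proxy only applies to the development server (`npm run dev`).

```ts
// vite.config.ts: a minimal sketch, not the exact file from this repo.
// The proxied path and target address are assumptions; adjust them to match
// the existing config and the location of your llama.cpp server.
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    proxy: {
      // Forward API requests from the Vite dev server to the llama.cpp HTTP server.
      "/completion": {
        target: "http://192.168.1.100:8080", // <- your llama.cpp server IP and port
        changeOrigin: true,
      },
    },
  },
});
```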
```sh
npm install     # install dependencies
npm run dev     # start the development server
npm run build   # build the static playground
```
The build generates a single HTML file that can be opened directly in the browser without a web server (only the llama.cpp server needs to be running).