-
Hi, is there a way to turn off MMAP? I'm testing local-ai on AWS Lambda, and the process crashes when loading the model. Lambda does not support MMAP. Is there a configuration option to turn it off?
Replies: 5 comments
-
Hmm, I turned MMAP off in the source code, but I still get an error. Any idea about the "SIGILL: illegal instruction"?
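A "SIGILL: illegal instruction" usually means the binary was compiled with CPU instructions the host does not implement. As a minimal sketch (plain Linux tooling, nothing LocalAI-specific), you can compare the SIMD flags of the build machine against the Lambda runtime:

```sh
# List the SSE/AVX feature flags the current CPU exposes.
# If the build host prints avx512* flags but the Lambda
# runtime does not, AVX-512 code is a likely cause of SIGILL.
grep -m1 '^flags' /proc/cpuinfo | tr ' ' '\n' | grep -E '^(sse|avx)' | sort -u
```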
-
The same model loads successfully with
-
Hmm, these instructions could be AVX-512 SIMD instructions. Lambda x86_64 only supports AVX and AVX2. Is there a way to compile llama.cpp for AVX2 only?
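For reference, llama.cpp has per-instruction-set build toggles. Here is a sketch of an AVX2-only build, assuming the CMake option names llama.cpp used around the ggmlv3 era (LLAMA_AVX512, LLAMA_NATIVE); check your version's CMakeLists.txt, as the names have changed over time:

```sh
# Configure llama.cpp with AVX/AVX2 enabled but AVX-512 off.
# LLAMA_NATIVE=OFF avoids -march=native, which would otherwise
# pick up AVX-512 support from the build host's CPU.
cmake -B build \
  -DLLAMA_NATIVE=OFF \
  -DLLAMA_AVX=ON \
  -DLLAMA_AVX2=ON \
  -DLLAMA_AVX512=OFF
cmake --build build --config Release
```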
-
I found a workaround: compiling Local-ai on a c4 instance, which does not have AVX-512 support. Again, is there any configuration to turn off MMAP, so that I don't have to modify the code and recompile it?
-
I found out how to disable MMAP without changing the code. In the model's configuration file, you just specify mmap: false:

```yaml
name: gpt-3.5-turbo
# Default model parameters
parameters:
  # Relative to the models path
  model: WizardLM-7B-uncensored.ggmlv3.q4_K_M.bin
  repeat_penalty: 1.2
  temperature: 0.2
mmap: false
mmlock: false
```
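To verify the model actually loads with MMAP disabled, here is a quick smoke test against LocalAI's OpenAI-compatible API (a sketch assuming the server is on its default port 8080; the model field must match the name in the YAML above):

```sh
# Ask the configured model for a completion; a successful
# response means the model loaded without memory-mapping.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Say hello"}],
    "temperature": 0.2
  }'
```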