After doing incremental pre-training on chinese-alpaca-2-1.3b, which original model should the resulting LoRA be merged with? Is it also Llama-2-7b-hf?
Answered by ymcui on Feb 21, 2024
If you trained the LoRA on alpaca-2-1.3b, then merge it with alpaca-2-1.3b.
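The reason is simple to see from the LoRA math: merging folds the low-rank update `(alpha/r) * B @ A` directly into the base weight matrix, so the adapter only makes sense on the exact base model whose weights it was trained against. A minimal NumPy sketch (illustrative only, not the project's merge script; shapes and the `alpha`/`r` values are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r, alpha = 8, 8, 2, 16          # assumed toy dimensions and LoRA scaling

W = rng.standard_normal((d, k))       # a base-model weight (e.g. from alpaca-2-1.3b)
A = rng.standard_normal((r, k))       # LoRA down-projection
B = rng.standard_normal((d, r))       # LoRA up-projection

# Merging folds the scaled low-rank update into the base weight.
W_merged = W + (alpha / r) * (B @ A)

x = rng.standard_normal(k)
y_adapter = W @ x + (alpha / r) * (B @ (A @ x))  # base model + adapter at inference
y_merged = W_merged @ x                          # merged model

# Identical outputs -- but only because W is the same matrix the LoRA saw in training.
assert np.allclose(y_adapter, y_merged)
```

Merging against a different base (such as Llama-2-7b-hf here) would add the update to weights the adapter was never trained on, and in this case the shapes would not even match, since the 1.3B and 7B models have different hidden sizes.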
Answer selected by azyu11