Can a Mongolian LLM be built by using Chinese-LLaMA-Alpaca-2 as the base, expanding the vocabulary with Mongolian tokens, continuing pre-training on a Mongolian corpus, and instruction fine-tuning on Mongolian-Chinese parallel data? #535
Answered by ymcui
fandaoerji asked this question in Q&A
-
I want to extend the model to Mongolian. Where can I find the model source code of Chinese-LLaMA-Alpaca-2? For example, I can't find anything like the llama/model.py in the original LLaMA 2 release.
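For orientation (an observation about how the project distributes its weights, not a statement from this thread): Chinese-LLaMA-Alpaca-2 ships Hugging Face-format checkpoints, so the architecture code is the LLaMA implementation inside the transformers library rather than a standalone model.py. A minimal sketch, with the checkpoint name as an illustrative assumption:

```python
# Locate the architecture code by loading an HF-format checkpoint and
# checking which transformers module implements it.
from transformers import LlamaForCausalLM

model = LlamaForCausalLM.from_pretrained("hfl/chinese-llama-2-7b")  # illustrative checkpoint name
print(type(model).__module__)  # e.g. transformers.models.llama.modeling_llama
```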
Answered by ymcui on Mar 7, 2024
Replies: 3 comments 3 replies
-
1) Theoretically feasible.
2) The training scripts are in the wiki; see for yourself: https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/wiki
2 replies
Answer selected by fandaoerji
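To make the proposed recipe concrete, here is a minimal sketch of the vocabulary-expansion step from the title's plan. It is not the project's official procedure (the scripts linked in the wiki above merge the SentencePiece models directly); the checkpoint name, paths, and the Mongolian SentencePiece model are illustrative assumptions.

```python
# Minimal sketch: add pieces from a separately trained Mongolian SentencePiece
# model to the base tokenizer, then resize the model's embeddings to match.
# (The official scripts merge the SentencePiece models themselves; this
# simplified version uses add_tokens instead.)
from transformers import LlamaTokenizer, LlamaForCausalLM
import sentencepiece as spm

base = "hfl/chinese-llama-2-7b"        # illustrative HF-format base checkpoint
mn_spm = "mongolian_sp.model"          # hypothetical Mongolian SentencePiece model

tokenizer = LlamaTokenizer.from_pretrained(base)
sp = spm.SentencePieceProcessor(model_file=mn_spm)

# Keep only pieces the base tokenizer does not already contain.
new_pieces = [sp.id_to_piece(i) for i in range(sp.get_piece_size())]
new_tokens = [p for p in new_pieces if p not in tokenizer.get_vocab()]
tokenizer.add_tokens(new_tokens)

model = LlamaForCausalLM.from_pretrained(base)
model.resize_token_embeddings(len(tokenizer))  # new rows start randomly initialized

tokenizer.save_pretrained("mongolian-llama-tokenizer")
model.save_pretrained("mongolian-llama-base")
```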
-
For continued pre-training with LoRA on Chinese-LLaMA-Alpaca-2, is a single A40 GPU enough? Assume the corpus is only 10 GB.
1 reply
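As a rough sizing illustration only (not an answer recorded in this thread): a 7B model in fp16 takes about 14 GB for the weights, and with LoRA the optimizer states cover only the adapter plus any modules kept trainable, so with gradient checkpointing and a small batch size the job typically fits in an A40's 48 GB; 4-bit (QLoRA-style) loading lowers the footprint further. A minimal sketch, with all names and hyperparameters as assumptions:

```python
# Illustrative single-GPU LoRA setup for continued pre-training; the checkpoint
# name, target modules, and hyperparameters are assumptions, not values from
# the thread or from the project's wiki scripts.
import torch
from transformers import LlamaForCausalLM
from peft import LoraConfig, get_peft_model

model = LlamaForCausalLM.from_pretrained(
    "mongolian-llama-base",            # hypothetical vocab-expanded checkpoint from the previous sketch
    torch_dtype=torch.float16,
)
model.gradient_checkpointing_enable()  # trades extra compute for lower activation memory

lora_cfg = LoraConfig(
    r=64,
    lora_alpha=128,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    # With an expanded vocabulary, the new embedding rows and the LM head must
    # also be trained, so keep them as fully trainable modules.
    modules_to_save=["embed_tokens", "lm_head"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()     # LoRA params are a small fraction of the 7B total
```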
-
If the corpus is entirely Mongolian, will the results be poor? Would it work better with a mix of Mongolian, Chinese, and English (not parallel/translated Mongolian-Chinese-English)?
0 replies