
Continue training using another dataset (二次训练)

RVC-Boss edited this page Sep 25, 2024 · 1 revision

1. Suppose I trained model A from the base (pre-trained) model using training set A, and now I want to continue training from model A's result using training set B. How do I do that?

(1) You do not need to change the paths of the three pre-trained models (the pre-trained G, D, and GPT models);

(2) After model A finishes training, switch to a new experiment name and run data preprocessing again on the new training set;

(3) Copy the logs/experiment_name/logs_s1 and logs/experiment_name/logs_s2 folders from the previous training into the new logs/experiment_name folder;

(4) Set higher epoch counts for s1 and s2 than in the previous run, then click the train button. Training will resume from model A's last checkpoint and continue up to the larger epoch count.
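The copy in step (3) can be sketched in Python. The experiment names `exp_A` and `exp_B` below are placeholders; in a real run the checkpoint folders are produced by your previous training, not created by hand:

```python
import shutil
from pathlib import Path

# Hypothetical experiment names; replace with your own.
old_exp = Path("logs/exp_A")
new_exp = Path("logs/exp_B")

# (Illustration only: stand-ins for the folders a previous run would have produced.)
for sub in ("logs_s1", "logs_s2"):
    (old_exp / sub).mkdir(parents=True, exist_ok=True)

# Step (3): copy the s1/s2 checkpoint folders into the new experiment folder.
new_exp.mkdir(parents=True, exist_ok=True)
for sub in ("logs_s1", "logs_s2"):
    shutil.copytree(old_exp / sub, new_exp / sub, dirs_exist_ok=True)
```

After the copy, running training under the new experiment name picks up the checkpoints found in its logs_s1/logs_s2 folders.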

2. How do I unlock the maximum number of epochs?

(1) You only need a higher epoch count when the training set is very large. If the training set is small, training beyond the default epoch limit may actually degrade quality;

(2) In webui.py, search for the keywords total_epoch (the SoVITS epoch limit) and total_epoch1Bb (the GPT epoch limit), and increase the maximum parameter on the same line.
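As a minimal sketch, the edit in step (2) can even be scripted with a regular expression. This assumes the line defining each variable contains a `maximum=<number>` argument (typical for a slider definition); the example line and the helper `raise_epoch_limit` below are illustrative, so inspect your actual webui.py before applying anything like this:

```python
import re

def raise_epoch_limit(source: str, var_name: str, new_max: int) -> str:
    """On the line assigning `var_name`, replace `maximum=<number>` with `new_max`.

    Hypothetical helper for illustration; it leaves the text unchanged
    if no matching line is found.
    """
    pattern = rf"({var_name}\s*=\s*[^\n]*?maximum=)\d+"
    return re.sub(pattern, rf"\g<1>{new_max}", source)

# Made-up line resembling a slider definition in webui.py:
line = 'total_epoch = gr.Slider(minimum=1, maximum=25, step=1, label="epochs")'
print(raise_epoch_limit(line, "total_epoch", 100))  # the printed line now has maximum=100
```

The same call with `"total_epoch1Bb"` would raise the GPT limit. Doing the two edits by hand in an editor is just as valid; the point is only that the upper bound lives in the `maximum=` argument on those lines.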