Training script
bash run_train_2.sh
[INFO|trainer.py:2144] 2025-05-13 16:26:32,249 >> ***** Running training *****
[INFO|trainer.py:2145] 2025-05-13 16:26:32,249 >> Num examples = 2,699
[INFO|trainer.py:2146] 2025-05-13 16:26:32,249 >> Num Epochs = 5
[INFO|trainer.py:2147] 2025-05-13 16:26:32,249 >> Instantaneous batch size per device = 1
[INFO|trainer.py:2150] 2025-05-13 16:26:32,249 >> Total train batch size (w. parallel, distributed & accumulation) = 1
[INFO|trainer.py:2151] 2025-05-13 16:26:32,249 >> Gradient Accumulation steps = 1
[INFO|trainer.py:2152] 2025-05-13 16:26:32,249 >> Total optimization steps = 13,495
[INFO|trainer.py:2153] 2025-05-13 16:26:32,250 >> Number of trainable parameters = 105,023,236
{'loss': 1.6225, 'grad_norm': 0.6827912926673889, 'learning_rate': 1.925898480918859e-05, 'epoch': 0.19}
4%|████▏ | 500/13495 [03:34<1:33:16, 2.32it/s]
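The exact arguments are defined inside the shell script and are not reproduced here. As a rough, assumption-laden sketch, the script likely wraps run_glue.py along these lines; the model path, data files, sequence length, and output directory below are placeholders, while the epoch count, batch size, gradient accumulation, and learning rate are taken from the log above:

```bash
#!/usr/bin/env bash
# Hedged sketch of what run_train_2.sh plausibly passes to run_glue.py.
# Paths, model name, and max_seq_length are assumptions; epochs, batch size,
# gradient accumulation, and learning rate match the training log above.
python run_glue.py \
  --model_name_or_path /path/to/pretrained-model \
  --train_file data/train.json \
  --validation_file data/dev.json \
  --do_train \
  --do_eval \
  --max_seq_length 512 \
  --per_device_train_batch_size 1 \
  --gradient_accumulation_steps 1 \
  --learning_rate 2e-5 \
  --num_train_epochs 5 \
  --output_dir ./output \
  --overwrite_output_dir
```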
Evaluation
python 批量测试结果.py
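The logic of 批量测试结果.py is not shown in this README. A minimal sketch of batch inference with the fine-tuned checkpoint might look like the following; the checkpoint directory, input file name, and batch size are assumptions, not the script's actual configuration:

```python
# Minimal batch-inference sketch (assumptions: checkpoint in "./output",
# "test.txt" with one input text per line). The real logic lives in
# 批量测试结果.py; this only illustrates the general pattern.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("./output")
model = AutoModelForSequenceClassification.from_pretrained("./output").to(device).eval()

with open("test.txt", encoding="utf-8") as f:
    texts = [line.strip() for line in f if line.strip()]

batch_size = 16
predictions = []
with torch.no_grad():
    for i in range(0, len(texts), batch_size):
        batch = tokenizer(
            texts[i : i + batch_size],
            padding=True,
            truncation=True,
            max_length=512,
            return_tensors="pt",
        ).to(device)
        logits = model(**batch).logits
        predictions.extend(logits.argmax(dim=-1).tolist())

for text, label in zip(texts, predictions):
    print(label, text[:50])
```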