# 基于Transformer的机器翻译实战 (Transformer-based Machine Translation in Practice)

Environment (a quick check of this setup is sketched below):

  • torch 1.6 (GPU)
  • Python 3.8
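
The following snippet is a minimal, illustrative sanity check (not part of this repository's code) that confirms the interpreter version, the PyTorch build, and GPU availability before training:

```python
# Minimal environment check (illustrative; not part of the repo's code).
import sys
import torch

print("Python:", sys.version.split()[0])              # expect 3.8.x
print("PyTorch:", torch.__version__)                  # expect 1.6.x
print("CUDA available:", torch.cuda.is_available())   # expect True for GPU training
```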

Sample run output (training followed by prediction):

```
tgt_vocab 2537
> start train
  mask = torch.nonzero(target.data == self.padding_idx)
Epoch 0 Batch: 0 Loss: 7.869846 Tokens per Sec: 6.729871s
Epoch 0 Batch: 50 Loss: 6.973424 Tokens per Sec: 9.369555s
Epoch 0 Batch: 100 Loss: 6.151075 Tokens per Sec: 9.335995s
>> Evaluate
Epoch 0 Batch: 0 Loss: 5.811797 Tokens per Sec: 10.205480s
<<<<< Evaluate loss: 5.894540
****** Save model done... ******
Epoch 1 Batch: 0 Loss: 5.817729 Tokens per Sec: 9.177809s
Epoch 1 Batch: 50 Loss: 5.375205 Tokens per Sec: 9.392804s
Epoch 1 Batch: 100 Loss: 5.004299 Tokens per Sec: 9.312739s
>> Evaluate
Epoch 1 Batch: 0 Loss: 4.876106 Tokens per Sec: 10.117515s
<<<<< Evaluate loss: 4.963982
****** Save model done... ******
Epoch 2 Batch: 0 Loss: 4.972553 Tokens per Sec: 9.326845s
Epoch 2 Batch: 50 Loss: 4.716051 Tokens per Sec: 9.343461s
Epoch 2 Batch: 100 Loss: 4.421447 Tokens per Sec: 9.298382s
>> Evaluate
Epoch 2 Batch: 0 Loss: 4.322602 Tokens per Sec: 10.029434s
<<<<< Evaluate loss: 4.404554
****** Save model done... ******
Epoch 3 Batch: 0 Loss: 4.321961 Tokens per Sec: 9.205928s
Epoch 3 Batch: 50 Loss: 4.124991 Tokens per Sec: 9.310113s
Epoch 3 Batch: 100 Loss: 3.811377 Tokens per Sec: 9.258893s
>> Evaluate
Epoch 3 Batch: 0 Loss: 3.755665 Tokens per Sec: 9.987302s
<<<<< Evaluate loss: 3.840265
****** Save model done... ******
Epoch 4 Batch: 0 Loss: 3.806191 Tokens per Sec: 9.237126s
Epoch 4 Batch: 50 Loss: 3.649720 Tokens per Sec: 9.304432s
Epoch 4 Batch: 100 Loss: 3.349196 Tokens per Sec: 9.241163s
>> Evaluate
Epoch 4 Batch: 0 Loss: 3.337457 Tokens per Sec: 10.032519s
<<<<< Evaluate loss: 3.417614
****** Save model done... ******
Epoch 5 Batch: 0 Loss: 3.459634 Tokens per Sec: 9.176797s
Epoch 5 Batch: 50 Loss: 3.273624 Tokens per Sec: 9.308859s
Epoch 5 Batch: 100 Loss: 2.998452 Tokens per Sec: 9.213062s
>> Evaluate
Epoch 5 Batch: 0 Loss: 2.960990 Tokens per Sec: 10.029608s
<<<<< Evaluate loss: 3.052980
****** Save model done... ******
Epoch 6 Batch: 0 Loss: 3.211699 Tokens per Sec: 9.185010s
Epoch 6 Batch: 50 Loss: 2.893345 Tokens per Sec: 9.238220s
Epoch 6 Batch: 100 Loss: 2.681960 Tokens per Sec: 9.188497s
>> Evaluate
Epoch 6 Batch: 0 Loss: 2.675523 Tokens per Sec: 9.989699s
<<<<< Evaluate loss: 2.747432
****** Save model done... ******
Epoch 7 Batch: 0 Loss: 2.950672 Tokens per Sec: 9.177439s
Epoch 7 Batch: 50 Loss: 2.645654 Tokens per Sec: 9.243345s
Epoch 7 Batch: 100 Loss: 2.393623 Tokens per Sec: 9.190061s
>> Evaluate
Epoch 7 Batch: 0 Loss: 2.380331 Tokens per Sec: 10.032775s
<<<<< Evaluate loss: 2.472470
****** Save model done... ******
Epoch 8 Batch: 0 Loss: 2.514863 Tokens per Sec: 9.237069s
Epoch 8 Batch: 50 Loss: 2.363517 Tokens per Sec: 9.244510s
Epoch 8 Batch: 100 Loss: 2.082449 Tokens per Sec: 9.155179s
>> Evaluate
Epoch 8 Batch: 0 Loss: 2.099774 Tokens per Sec: 10.031063s
<<<<< Evaluate loss: 2.180049
****** Save model done... ******
Epoch 9 Batch: 0 Loss: 2.320165 Tokens per Sec: 9.147264s
Epoch 9 Batch: 50 Loss: 2.092441 Tokens per Sec: 9.262332s
Epoch 9 Batch: 100 Loss: 1.818154 Tokens per Sec: 9.175017s
>> Evaluate
Epoch 9 Batch: 0 Loss: 1.842787 Tokens per Sec: 10.029404s
<<<<< Evaluate loss: 1.941446
****** Save model done... ******
Epoch 10 Batch: 0 Loss: 1.946543 Tokens per Sec: 8.972206s
Epoch 10 Batch: 50 Loss: 1.917192 Tokens per Sec: 9.136343s
Epoch 10 Batch: 100 Loss: 1.587525 Tokens per Sec: 9.199168s
>> Evaluate
Epoch 10 Batch: 0 Loss: 1.625920 Tokens per Sec: 9.988358s
<<<<< Evaluate loss: 1.738675
****** Save model done... ******
Epoch 11 Batch: 0 Loss: 1.732364 Tokens per Sec: 9.146457s
Epoch 11 Batch: 50 Loss: 1.695014 Tokens per Sec: 9.187964s
Epoch 11 Batch: 100 Loss: 1.454914 Tokens per Sec: 9.176912s
>> Evaluate
Epoch 11 Batch: 0 Loss: 1.514616 Tokens per Sec: 9.988358s
<<<<< Evaluate loss: 1.647388
****** Save model done... ******
Epoch 12 Batch: 0 Loss: 1.626071 Tokens per Sec: 9.059952s
Epoch 12 Batch: 50 Loss: 1.541598 Tokens per Sec: 9.212269s
Epoch 12 Batch: 100 Loss: 1.336604 Tokens per Sec: 9.202791s
>> Evaluate
Epoch 12 Batch: 0 Loss: 1.401758 Tokens per Sec: 9.862439s
<<<<< Evaluate loss: 1.507253
****** Save model done... ******
Epoch 13 Batch: 0 Loss: 1.420878 Tokens per Sec: 9.117863s
Epoch 13 Batch: 50 Loss: 1.440541 Tokens per Sec: 9.189840s
Epoch 13 Batch: 100 Loss: 1.132154 Tokens per Sec: 9.172929s
>> Evaluate
Epoch 13 Batch: 0 Loss: 1.207064 Tokens per Sec: 9.904041s
<<<<< Evaluate loss: 1.320485
****** Save model done... ******
Epoch 14 Batch: 0 Loss: 1.226669 Tokens per Sec: 9.118559s
Epoch 14 Batch: 50 Loss: 1.261432 Tokens per Sec: 9.169248s
Epoch 14 Batch: 100 Loss: 1.040873 Tokens per Sec: 9.119401s
>> Evaluate
Epoch 14 Batch: 0 Loss: 1.071591 Tokens per Sec: 9.821342s
<<<<< Evaluate loss: 1.176238
****** Save model done... ******
Epoch 15 Batch: 0 Loss: 1.086495 Tokens per Sec: 9.088573s
Epoch 15 Batch: 50 Loss: 1.159208 Tokens per Sec: 9.132641s
Epoch 15 Batch: 100 Loss: 0.943899 Tokens per Sec: 9.176910s
>> Evaluate
Epoch 15 Batch: 0 Loss: 0.962707 Tokens per Sec: 9.904061s
<<<<< Evaluate loss: 1.063912
****** Save model done... ******
Epoch 16 Batch: 0 Loss: 0.978801 Tokens per Sec: 9.002674s
Epoch 16 Batch: 50 Loss: 1.079826 Tokens per Sec: 9.086403s
Epoch 16 Batch: 100 Loss: 0.851728 Tokens per Sec: 9.041045s
>> Evaluate
Epoch 16 Batch: 0 Loss: 0.851689 Tokens per Sec: 9.241199s
<<<<< Evaluate loss: 0.968978
****** Save model done... ******
Epoch 17 Batch: 0 Loss: 0.919070 Tokens per Sec: 5.768479s
Epoch 17 Batch: 50 Loss: 1.026209 Tokens per Sec: 5.864358s
Epoch 17 Batch: 100 Loss: 0.791767 Tokens per Sec: 5.730444s
>> Evaluate
Epoch 17 Batch: 0 Loss: 0.779452 Tokens per Sec: 6.049646s
<<<<< Evaluate loss: 0.875268
****** Save model done... ******
Epoch 18 Batch: 0 Loss: 0.837940 Tokens per Sec: 5.839964s
Epoch 18 Batch: 50 Loss: 0.900170 Tokens per Sec: 5.841982s
Epoch 18 Batch: 100 Loss: 0.701193 Tokens per Sec: 5.800162s
>> Evaluate
Epoch 18 Batch: 0 Loss: 0.663479 Tokens per Sec: 6.065270s
<<<<< Evaluate loss: 0.769979
****** Save model done... ******
Epoch 19 Batch: 0 Loss: 0.751043 Tokens per Sec: 5.827926s
Epoch 19 Batch: 50 Loss: 0.829178 Tokens per Sec: 5.816878s
Epoch 19 Batch: 100 Loss: 0.635180 Tokens per Sec: 5.778595s
>> Evaluate
Epoch 19 Batch: 0 Loss: 0.614568 Tokens per Sec: 6.010801s
<<<<< Evaluate loss: 0.695584
****** Save model done... ******
<<<<<<< finished train, cost 418.6273 seconds
> start predict
BOS look around . EOS
BOS 四 处 看 看 。 EOS
translation: 继 续 看 。
BOS hurry up . EOS
BOS 赶 快 ! EOS
translation: 快 点 !
BOS keep trying . EOS
BOS 继 续 努 力 。 EOS
translation: 继 续 考 试 。
BOS take it . EOS
BOS 拿 走 吧 。 EOS
translation: 拿 走 。
BOS birds fly . EOS
BOS 鸟 类 飞 行 。 EOS
translation: 鸟 类 穿 鸟 类 。
BOS hurry up . EOS
BOS 快 点 ! EOS
translation: 快 点 !
BOS look there . EOS
BOS 看 那 里 。 EOS
translation: 看 那 边 。
```
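
The warning line `mask = torch.nonzero(target.data == self.padding_idx)` near the top of the log is PyTorch 1.6's `torch.nonzero` deprecation notice, triggered where the label-smoothing criterion masks out padding positions in the target. The sketch below shows what such a criterion typically looks like in Annotated-Transformer-style code; the class and attribute names are illustrative and may differ from this repository's implementation. Passing `as_tuple=False` explicitly silences the warning.

```python
# Illustrative label-smoothing criterion (Annotated-Transformer style);
# names are assumptions, not necessarily this repo's exact implementation.
import torch
import torch.nn as nn

class LabelSmoothing(nn.Module):
    def __init__(self, size, padding_idx, smoothing=0.1):
        super().__init__()
        self.criterion = nn.KLDivLoss(reduction="sum")
        self.padding_idx = padding_idx
        self.confidence = 1.0 - smoothing
        self.smoothing = smoothing
        self.size = size  # target vocabulary size (e.g. tgt_vocab above)

    def forward(self, x, target):
        # x: (N, vocab) log-probabilities, target: (N,) token ids
        assert x.size(1) == self.size
        true_dist = x.data.clone()
        true_dist.fill_(self.smoothing / (self.size - 2))
        true_dist.scatter_(1, target.data.unsqueeze(1), self.confidence)
        true_dist[:, self.padding_idx] = 0
        # as_tuple=False avoids the deprecation warning seen in the log.
        mask = torch.nonzero(target.data == self.padding_idx, as_tuple=False)
        if mask.numel() > 0:
            true_dist.index_fill_(0, mask.squeeze(), 0.0)  # zero out padded rows
        return self.criterion(x, true_dist)
```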

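The prediction block prints, for each example, the tokenized source, the reference translation, and the model output, each wrapped in BOS/EOS markers. Results like these are normally produced by greedy decoding; a sketch of one common implementation follows. The `encode`/`decode`/`generator` interface and the `subsequent_mask` helper are assumptions about the model's API, not a description of this repository's exact code.

```python
# Illustrative greedy decoder for an Annotated-Transformer-style model.
# model.encode / model.decode / model.generator are assumed interfaces.
import torch

def subsequent_mask(size):
    # Lower-triangular mask that hides future target positions.
    return torch.tril(torch.ones(1, size, size)).bool()

def greedy_decode(model, src, src_mask, max_len, start_symbol):
    memory = model.encode(src, src_mask)
    ys = torch.full((1, 1), start_symbol, dtype=torch.long, device=src.device)
    for _ in range(max_len - 1):
        tgt_mask = subsequent_mask(ys.size(1)).to(src.device)
        out = model.decode(memory, src_mask, ys, tgt_mask)
        log_probs = model.generator(out[:, -1])   # distribution over next token
        next_word = log_probs.argmax(dim=1).item()
        ys = torch.cat(
            [ys, torch.full((1, 1), next_word, dtype=torch.long, device=src.device)],
            dim=1,
        )
    return ys  # sequence of token ids starting with BOS
```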